The publication sifted through over 100 of the website's training manuals, spreadsheets, and flowcharts that show how the site polices its content - and it's pretty mind-boggling which types of content are allowed to stay up.
Files allegedly shared with Facebook staff and seen by the Guardian newspaper claim that in January the social site had to assess almost 54,000 cases of revenge pornography - 33 of which involved children. In one reported case, YouTube took a video down almost immediately, but it took administrators two weeks to remove it from Facebook, where it received thousands of shares and comments.
The rules on violence, for instance, allow threats like "I'm going to kill you" or "F*** off and die", which Facebook deems not credible, treating them instead as "a violent expression of dislike and frustration". People can live-stream attempts at self-harm because the platform doesn't "want to censor or punish people in distress".
Facebook's head of global policy management, Monika Bickert, said it was always going to be hard to create standards when things aren't necessarily black and white. These are not terms or words usually used on Newstalk.com, but we have chosen to publish them to illustrate how the Facebook policing process works.
Facebook allows users to post videos of abortions, as long as they do not contain any nudity.
Documents instruct Facebook moderators to delete remarks such as "Someone shoot Trump", because as president of the United States he is in a protected category.
Photographic proof of physical abuse or the bullying of children does not have to be deleted unless there is a "sadistic or celebratory element".
Photos of animal abuse can be shared, with only extremely upsetting imagery to be marked as "disturbing".
Violent videos, including those depicting self-harm, did not always have to be deleted, because they could be marked as disturbing and might create awareness.
Facebook has said in the past that it is in a unique position to do more about the suicide epidemic.
Along with adding more people to its team of moderators, Facebook says it will offer them more tools to help them respond faster.
However, the leaked documents also showed the company is taking measures to improve its policies, even if they are only implemented following public pressure.
In the documents, the company noted that people use violent language on Facebook because they feel it won't come back to them. "However, because of the contagion risk [i.e., some people who see suicide are more likely to consider suicide], what's best for the safety of people watching these videos is for us to remove them once there's no longer an opportunity to help the person".

Hand-made art showing sexual activity and nudity is allowed on the platform; digitally-made art showing sexual activity is not.
The investigation, based on the leaked files, also claims Facebook has only recently banned users from posting images that mock people with illnesses and other health issues.