Leaked Facebook documents show how the social-media company moderates issues such as hate speech, terrorism, pornography and self-harm on its platform, The Guardian reported, citing internal guidelines seen by the newspaper.

New challenges such as “revenge porn” have overwhelmed Facebook’s moderators who often have just 10 seconds to make a decision, The Guardian said. The social-media company reviews more than 6.5 million reports of potentially fake accounts a week, the newspaper added.

Many of the company’s content moderators have raised concerns about the inconsistency and peculiar nature of some of the policies; the rules on sexual content, for example, are said to be the most complex and confusing. Facebook had no specific comment on the report but said safety was its overriding concern. “Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously,” Facebook’s Head of Global Policy Management, Monika Bickert, said in a statement.

Facebook confirmed that it was using software to intercept graphic content before it appeared on the website, but said the technology was still in its early stages.

The leaked documents included internal training manuals, spreadsheets and flowcharts, The Guardian said.

The newspaper cited, as an example, a Facebook policy that allowed people to live-stream attempts to self-harm because the company “doesn’t want to censor or punish people in distress.”

Facebook moderators were recently told to escalate to senior managers any content related to 13 Reasons Why, a Netflix original drama series centred on the suicide of a high-school student, over fears it could inspire copycat behaviour, the newspaper reported.
