Facebook's under fire (again) after a leak of secret documents reveals a mixed approach to moderating violence, sex and abuse

 
Lynsey Barber
Facebook is facing a tough job on moderating content (Source: Getty)

Facebook is under fire again, as secret documents used by the tech giant to guide how it moderates extreme content and abuse were leaked to the Guardian.

The rules revealed for moderating content that can be considered abusive, violent or sexualised - including pornography and self-harm - appear inconsistent and confusing.

The comment "I hope someone kills you", for example, is permitted under the rules, according to the documents, which the Guardian says were given to moderators within the last year. The threat is not considered credible under the rules, and neither is "fuck off and die". But "someone shoot Trump" should be deleted, because heads of state fall into a protected category.

Read more: Facebook's ramping up its fake news fight ahead of the election... in print

And those employed as moderators are sometimes given just seconds to make a decision on whether a post should be removed or not, the report claims.

Facebook has long insisted it is a tech company and not a media company, but has ramped up its efforts to tackle problems such as fake news.

The social network promised to hire 3,000 more people to screen videos and images following a furore over murders being live-streamed via Facebook Live.

Read more: Facebook is hiring 3,000 people to screen videos and live streams

The leak also comes after the Conservative Party promised to regulate the internet in its manifesto ahead of the General Election. It wants social media companies to pay a levy to help police them and has threatened fines if they fail to remove illegal content.

MPs at home and lawmakers across Europe have put growing pressure on Facebook and other tech companies in recent months over how they tackle content moderation.

However, experts have raised concerns over the Conservatives' proposals.

"On online content, we have seen some extremely carefully worded commitments around extreme content on social media, which focus on the responsibility of platforms to enable reporting of content and explain when content is not removed," said experts at Demos.

"The language used for these promises are worryingly vague: illegal content yes, but 'sources of harm' is extremely unclear. It also discusses a responsibility for sites not to direct users to this content. If this is enacted, expect a heated debate over what precisely constitutes 'direction'."

Facebook's head of global policy management Monika Bickert told the newspaper it was difficult to reach a consensus on what to allow with nearly two billion users.
