The Guardian publishes internal Facebook guidelines for moderators

The Guardian has published more than a hundred internal Facebook documents in a collection it calls the Facebook Files. These include manuals and presentations containing guidelines on how moderators should handle certain types of content.

According to the British newspaper, these are internal rules that were handed out to Facebook moderators over the past year. The documents reportedly show that Facebook is struggling to cope with the sheer volume of content on its social network. Moderators have to review so much material that they sometimes have only ten seconds to make a decision, and many of them reportedly have reservations about the rules, which are at times complex and contradictory.

The Guardian gives some examples from the rules, such as those on threats. Calling for President Trump to be shot is not allowed, but explaining how to break someone's neck is; according to the rules, the latter does not constitute an actual threat. The guidelines state that violent language is most often not credible until the specificity of the language suggests it is no longer an expression of emotion but a transition to an actual plan. Videos showing an abortion are allowed, as long as they contain no nudity. Livestreaming self-harm is also permitted, because Facebook does not want to punish or censor people who are in distress.

Other topics covered include revenge porn, animal cruelty, hate speech and non-sexual child abuse. The documents reportedly provide insight into the rules the social network drew up under political pressure from the US and the EU following reports of fake news. Monika Bickert, who heads Facebook's global policy department, told the newspaper that the social network has nearly two billion users and that it is difficult to reach consensus on what content should be allowed. Whether something is offensive often depends on context, she said, and despite the rules there will always be gray areas.

It recently emerged that the EU is considering rules on the removal of hate speech from the internet, including by Facebook. Germany introduced a bill in April that would require social networks to delete messages reported as hate speech within a day; if they fail to do so, they risk fines of up to 50 million euros. In 2016, the EU drew up a code of conduct against online hate speech together with a number of tech companies, and at the end of that year the European Commission expressed its displeasure that the companies were not responding quickly enough. Earlier this month, Facebook announced that it is hiring 3,000 additional people to respond more quickly to unwanted posts.
