Facebook provides insight into how moderators work
Facebook has explained how it selects moderators and how they proceed when reviewing content that violates the network's rules. Among other things, the social networking site wants to show how moderators are supported in their work.
According to Facebook's explanation, about 20,000 people now work in its safety and security department. Of these, approximately 7,500 work as community moderators; they are the people who review posts that other users on Facebook have reported. In a post on the Facebook blog, the company explains how the team of moderators operates and how they are trained for their duties.
Facebook states that weeks of training are needed to prepare moderators for their tasks. First, the tasks are explained, after which future moderators receive hands-on training under the guidance of an instructor. Trainees spend at least 80 hours on this latter part. In addition, there are regular coaching sessions in which moderators receive feedback.
Moderators are automatically assigned reports to review. This takes into account, among other things, their location and time zone and the language they speak. According to Facebook, attention is also paid to local culture: the company prefers to have someone from the country in question assess a post, because they can better judge local customs.
Facebook says it has hired four psychologists to help moderators deal with shocking content. There is also access to medical care for those who need it. With the explanation, Facebook probably wants to provide more insight into how content review works; normally the social networking site reveals little about this. It says little about the personal circumstances of the moderators, mainly for their safety.
The company had previously been criticized for its use of moderators; internal documents reportedly indicate that moderators have to review too much content, sometimes with only ten seconds per post. In addition, many moderators reportedly have reservations about the rules, which are sometimes complex and contradictory. They also regularly suffer from psychological problems. With the explanation, Facebook probably wants to tell its side of the story.