A new report claims to reveal, for the first time, Facebook's secret content removal policies.
Excerpts of internal documents that the company allegedly hands out to both its own staff and third-party content moderators were provided to the German newspaper Sueddeutsche Zeitung by unidentified sources. Although Facebook touches on its guidelines on its website, the information it provides to its workers is far more detailed.
The chapter that stands out covers Facebook's stance on hate speech, an issue that has become particularly contentious in Germany, where the social network is currently facing a lawsuit over its alleged inaction on the matter.
The documents reveal a convoluted hate speech policy that contains a number of loopholes resulting from the criteria Facebook uses to determine what constitutes hateful rhetoric.
Facebook does not permit "verbal attacks" on a "protected category," according to the documents. These self-determined categories are currently based on a number of factors, including sex, gender, religious affiliation, race, ethnicity, sexual orientation, national origin, age, and disability or serious illness. Some of these categories contain sub-groups that receive extra protection (for example, under "age," criteria such as "youth" and "senior citizen" receive priority).
An overview at the end of the hate speech chapter is where things start to get a bit muddled. A sentence reportedly containing an expletive directly followed by a reference to a religious affiliation (for example: "f*cking Muslims") is not allowed. However, the same does not go for the term "migrants," as migrants are allegedly only a "quasi protected category." Additionally, Facebook reportedly allows for posts that could be deemed hateful against migrants under certain circumstances. For example, a statement such as "migrants are dirty" is allowed, whereas "migrants are dirt" isn't.
We reached out to Facebook to verify the accuracy of the documents, but did not immediately receive a response. If they do turn out to be official, the above examples could set off alarm bells for German authorities, seeing as the term "migrants" was only added to the list of criteria following public pressure in the country. Earlier this week, German Justice Minister Heiko Maas also urged an immediate crackdown on hate speech disseminated through social media sites such as Facebook.
A related report in the same German daily provides an in-depth look at the inner workings of Facebook's Berlin-based content moderation team. In it, several members of the company's 600-strong staff (which also includes employees outsourced from a Bertelsmann business services unit) claim to have suffered psychological harm as a result of the material they were exposed to. "I've seen things that made me seriously question my faith in humanity," said one anonymous worker. The report claims that the affected workers were not given access to professional help.
Another employee describes the tortuous guidelines Facebook allegedly has in place: "The rules are almost impossible to understand. I've said to my team leader: this is crazy! The picture is full of blood and brutality, no one should have to see that. But he said: that's just your opinion. You have to try and think about what Facebook wants. We're expected to think like machines."