Facebook’s arbitrary censorship imposes high costs on content moderators

While Facebook uses algorithms to automatically remove inappropriate content, it also employs more than 15,000 people to filter through user content (Sánchez and Sarabia, 2019). Reporters Sánchez and Sarabia of the Spanish newspaper El Diario uncovered the horrific reality behind this practice by interviewing these content reviewers, discovering an arbitrary decision-making process and an environment in which “it is impossible not to make mistakes because the system is very contradictory” (ibid).

Facebook’s work teams are divided by content type and by language. Each moderator watches a screen displaying Facebook posts that have accumulated a substantial number of ‘complaints’ from users for being inappropriate. Whenever enough people denounce (‘report’, in Facebook’s language) a piece of content, it is routed to one of those screens. The moderator then has two options: ‘Delete’ to remove the content or ‘Ignore’ to leave it published (ibid).
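To make this workflow concrete, the report-then-review loop can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not Facebook’s actual implementation: the threshold value, identifiers, and function names are all hypothetical, since the article says only that “a relevant number” of reports sends a post to a moderator’s screen.

    from collections import defaultdict
    from enum import Enum

    # Hypothetical threshold: the article says only that "a relevant number"
    # of reports routes a post to a moderator's screen; the real value is not public.
    REPORT_THRESHOLD = 100

    class Decision(Enum):
        DELETE = "delete"  # remove the content
        IGNORE = "ignore"  # leave it published

    report_counts = defaultdict(int)  # post_id -> number of user reports
    review_queue = []                 # posts awaiting a moderator's decision

    def report(post_id):
        """A user 'denounces' (reports) a post; enough reports enqueue it for review."""
        report_counts[post_id] += 1
        if report_counts[post_id] == REPORT_THRESHOLD:
            review_queue.append(post_id)

    def moderate(post_id, decision):
        """A moderator applies one of the two options the article describes."""
        if decision is Decision.DELETE:
            print(f"{post_id}: content deleted")
        else:
            print(f"{post_id}: left published")

Even in this toy form, the design makes the article’s point visible: whether content reaches review at all depends on how many users complain, not on what the content actually is.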

While the training given to moderators is usually quite simplistic, content moderation in practice becomes complex very quickly. On top of this, Facebook imposes very strict internal rules on its workers for each possible case: for example, they must reach a decision on each piece of content within a short time span. As a result, content moderators find themselves overworked, confused, and stressed.

Facebook’s content rules seem to operate as an often incongruous patchwork. According to an internal source, a comment extolling Hitler is deleted, yet a similar comment by a Spaniard praising Franco is left published. Facebook allows fascism: “You can get a page full of photos of Mussolini or Franco and nothing happens, it’s totally allowed” (ibid). This likely stems from the fact that glorifying Francoism is not a crime in Spain, whereas glorifying Nazism is a crime in Germany. Although Facebook’s rules are not subject to the laws of any one country, they carry an undeniable American stamp. Regarding terrorist organizations, for example, reviewers work from the list of terrorist groups prepared by the US government. Although Hamas is not considered a terrorist group by the European Union (which maintains its own list of terrorist organizations), the US list is the one that matters to Facebook (Sánchez and Sarabia, 2019).

Internal sources emphasize the double standards and the biased protection of specific groups over others, regardless of what any written rules state: “Insulting certain beliefs or ideologies is de facto allowed, there are some beliefs or ideologies that are specially protected on Facebook” (ibid). Zionism, for example, has been found to receive preferential treatment (ibid).

The interviewees conclude that the likelihood of content being erased by Facebook is directly proportional to the power and organizational capacity of the group that feels alluded to or attacked. Facebook acknowledges that its automatic censorship systems are projects that will require extended periods of time to develop and perfect. For now, the mechanism largely relies on a post receiving a sufficient number of complaints and then undergoing human review.

Facebook has also faced criticism for the working conditions imposed on its moderators, who are exposed to hours of extremely disturbing content every day. While a psychologist is on staff for the team, meeting with them is not obligatory. On top of this, a content reviewer is always under surveillance: someone who pulls a bottle of water or a phone out of their pocket can be reported by coworkers and sanctioned by the bosses. For the entire working day, content moderators are given only a 30-minute break, which they have to distribute throughout the day in order to eat, use the bathroom, or stretch their legs (Sánchez and Sarabia, 2019). Facebook’s moderators in Barcelona, Warsaw, or Lisbon meet the requirements to be labeled telemarketers and are usually hired through ‘customer service’ companies such as Competence Call Center in Barcelona. This loophole allows the company to legally distance itself from their working conditions, as well as to deny them benefits, given that they are not full-time workers (ibid).