How do Facebook, Instagram and other social media sites protect you, the consumer, from disturbing content on the web? Many of us assume that sophisticated algorithms perform this constant editing, but algorithms are not able to make subtle distinctions, for example between art and pornography. Instead, technology companies rely on people to do this work: commercial content moderators, whose work is known as Commercial Content Moderation (CCM).
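One way to picture this division of labour between algorithms and people is a confidence threshold: the machine handles only the clear-cut cases and hands everything ambiguous to a human reviewer. The sketch below is purely illustrative; the thresholds, labels and routing logic are assumptions, not any platform's actual pipeline.

```python
# Hypothetical sketch, not any platform's actual pipeline: an automated
# classifier handles only clear-cut cases; everything ambiguous is routed
# to a human moderation queue, because the model cannot reliably tell
# art from pornography or news footage from gratuitous violence.

from dataclasses import dataclass


@dataclass
class ClassifierResult:
    label: str         # e.g. "nudity", "violence", "benign"
    confidence: float  # 0.0 .. 1.0, produced by some upstream model


AUTO_REMOVE_THRESHOLD = 0.98   # illustrative values only
AUTO_ALLOW_THRESHOLD = 0.95


def route(result: ClassifierResult) -> str:
    """Decide what happens to a piece of content."""
    if result.label != "benign" and result.confidence >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"       # the machine is (almost) certain
    if result.label == "benign" and result.confidence >= AUTO_ALLOW_THRESHOLD:
        return "auto_allow"
    # Everything in between lands in front of a human moderator.
    return "human_review_queue"


print(route(ClassifierResult("nudity", 0.61)))  # -> human_review_queue
```

However the thresholds are set, a large share of content falls into that middle band, which is why the human workforce is so large.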
While CCM workers remain largely invisible, some scholars, journalists and artists have started to draw attention to this workforce and the toll that editing the Internet takes on them.
Internet companies moderate their content in four ways. They employ in-house content moderators; they hire public relations and advertising agencies, which offer subcontracted Commercial Content Moderation as an additional service; they use specialized firms such as TaskUs, MicroSourcing or Arvato; or they post the work on micro-task websites such as Amazon Mechanical Turk. The last option is less popular, since workers there cannot be monitored as closely as in the other settings.
“Digital trash removal” is part of a global supply chain. Much of the work is done in developing countries, especially in the Philippines, though some businesses operate in North America and Europe. There are no official statistics, but estimates suggest that, globally, more than 100,000 content moderators delete images, videos and texts of extreme violence and sexual perversion for ten or more hours a day. CCM workers in the Philippines are mostly young, often college-educated, and earn barely more than the minimum wage.
These workers pay a high price for their jobs. Many suffer from psychological conditions, including depression and paranoia. The images stay with them after the shift has ended, and many try to drown them in alcohol. In this little-known corner of the labour market, it has become standard practice to let contractors and employees go after two years, before they break down and become incapable of continuing.
It is difficult to obtain information about working conditions in CCM businesses, since workers who provide CCM services often have to sign non-disclosure agreements (NDAs). They are not allowed to talk about their work, not even with their partners, and risk severe penalties if they do. The impact on relationships and social life can be devastating.
The criteria that guide content moderators in their decisions, which are made within seconds, are obscure and arbitrary. For example, images showing a naked breast are allowed only if the nipple is not visible. Crushed skulls are acceptable, but spilled brains are not. Images showing violence in Syria are allowed, but images showing the same atrocities in Mexico are not.
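Written down, such guidelines amount to a crude decision table that moderators must apply at speed. The sketch below simply encodes the three examples cited above; the data structure, labels and default action are assumptions made for illustration, not an actual policy document.

```python
# Hypothetical encoding of the guidelines cited in the text. Real moderation
# policies are confidential, so the structure, labels and default action
# here are assumptions made purely for illustration.

from typing import Optional

RULES = {
    ("breast", "nipple_visible"): "remove",
    ("breast", "nipple_not_visible"): "allow",
    ("crushed_skull", None): "allow",
    ("spilled_brains", None): "remove",
    ("war_footage", "syria"): "allow",
    ("war_footage", "mexico"): "remove",
}


def decide(content_type: str, qualifier: Optional[str] = None) -> str:
    # Default to removal when no rule matches; a real policy might instead
    # escalate unknown cases to a supervisor.
    return RULES.get((content_type, qualifier), "remove")


print(decide("war_footage", "mexico"))         # -> remove
print(decide("breast", "nipple_not_visible"))  # -> allow
```

Seen this way, the arbitrariness is built into the rules themselves; the worker's only discretion is how fast to apply them.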
The new type of work that CCM workers perform needs to be discussed when we talk about the future of work and digital labour. CCM raises concerns under at least two of the criteria for Unacceptable Forms of Work (UFW), which the ILO identifies and defines as work “in conditions that deny fundamental principles and rights at work, put at risk the lives, health, freedom, human dignity and security of workers or keep households in conditions of extreme poverty.” In the case of CCM work, the criteria of risk to workers’ health, security and dignity certainly apply.
The negative impact of CCM work also shows that the ILO needs to pay more attention to the psychological consequences of work, which are even more critical in the case of digital work. More issues will certainly emerge from CCM work, with far-reaching and complex impacts on our societies. Whatever regulation might be possible can only deal with the symptoms, since neither violence, nor sexual perversion, nor the Internet will go away anytime soon.