Development and Cooperation

Content moderation

Sorting out human horror for less than two dollars an hour

In Kenya, 185 former content moderators are currently suing Meta, the parent company of Facebook, WhatsApp and Instagram – and its outsourcing company Sama. They accuse them of violating workers’ rights. Many of those affected are severely traumatised.
Mercy Mutemi is one of the plaintiffs’ lawyers. Photo: picture alliance/AA/Gerald Anderson

While most of us scroll through our Facebook accounts without encountering any obscene content, you can be sure that there is someone out there who views and deletes such content, risking their mental well-being for a living. Behind every deleted video containing violence, abuse or hate speech, or any other content that violates Meta’s community standards, is a moderator who, under time pressure and for little money, weeds out the most harmful posts. 

In February 2022, Time Magazine first published the story of Daniel Motaung, a former Sama employee who was fired after trying to form a union and lead a strike over issues such as worker exploitation and low wages. According to the article, local Sama employees received a net wage of about $1.46 per hour after taxes, based on a 45-hour week.

After Motaung exposed “Facebook’s African sweatshop,” as Time’s headline put it, more and more deeply distressed content moderators came forward. They reported breakdowns at work after witnessing, for example, a video of a man being dismembered limb by limb. These breakdowns could not last long, however: supervisors regularly ordered moderators back to their screens, regardless of their mental state.

A total of 144 employees were diagnosed with post-traumatic stress disorder. All diagnoses were made by Ian Kanyanya, head of psychiatric services at Kenyatta National Hospital. In the case files, Kanyanya stated that the moderators he examined regularly witnessed extremely graphic content, including videos of atrocious murders, suicides, self-harm, explicit sexual content and child abuse.

Meta in denial

In a landmark decision, the Kenyan Court of Appeal ruled last September that content moderators can take their human-rights abuse claims against Meta to the country’s labour courts. Mercy Mutemi, a Kenyan lawyer representing a group of former Meta content moderators, wrote in Al Jazeera in early April that the plaintiffs were hired by Sama and worked exclusively as content moderators on Facebook, Instagram, WhatsApp and Messenger from 2019 to 2023. Meta denies being their employer and insists that they were employed entirely by Sama, which is headquartered in San Francisco.

The outsourcing model has allowed large tech companies to escape legal action. Nerima Wako-Ojiwa, a Kenyan political analyst and civil-rights activist, said on CBS News that Kenya’s labour laws needed to be updated to recognise digital work – but not only in Kenya. “Because what happens is when we start to push back, in terms of protections of workers, a lot of these companies … shut down, and they move to a neighbouring country,” she told the American news service.

Alba Nakuwa is a freelance journalist from South Sudan based in Nairobi.
albanakwa@gmail.com 
