Daniel Motaung remembers watching a video of a beheading while he worked as an outsourced content moderator in Kenya. Viewing violent and graphic content, he said, ended up taking him to a place he never imagined.
“Now, I have a heightened fear of death because of the content that I’ve moderated on a daily basis. And because of that, my quality of life has changed drastically,” he said during a virtual discussion Tuesday. “I don’t look forward to going outside. I don’t look forward to going in public spaces.”
The discussion, titled “Facebook Content Moderation, Human Rights: Democracy and Dignity at Risk,” came on the same day that attorneys for the former content moderator filed a lawsuit against Facebook parent company Meta and Sama, the outsourcing firm that partners with the social media giant for content moderation in Africa. The 52-page petition alleges that the companies violated the Kenyan constitution, accusing them of forced labor, human trafficking, treating workers in a “degrading manner” and union-busting. Motaung was fired from his job in 2019 after he tried to form a trade union, the lawsuit said.
The lawsuit, filed in Nairobi’s employment and labor relations court, is the latest in ongoing criticism Meta has faced over the working conditions of content moderators. In 2020, the company reached a settlement after content moderators in the US sued Facebook for allegedly failing to provide them with a safe workplace. The social network, which has more than 15,000 moderators, has struggled to police offensive content in multiple languages worldwide.
Meta didn’t immediately respond to a request for comment. Suzin Wold, a spokesperson for Sama, said in a statement that the allegations against the company “are both inaccurate and disappointing.” She said the company has helped lift more than 59,000 people out of poverty, has provided workers a competitive wage and is a “longstanding, trusted employer in East Africa.”
The lawsuit alleges that Sama targets poor and vulnerable youth for content moderation jobs, coercing them into signing employment contracts before they really understand what the role entails. Motaung, who came from a poor family, was looking for a job to support his family after college and didn’t know that content moderation could harm his mental health, the lawsuit said. He then suffered from post-traumatic stress disorder, severe depression, anxiety, a relapse in his epilepsy and vivid flashbacks and nightmares from moderating graphic content.
Content moderators aren’t given enough mental health support, must deal with irregular pay and can’t discuss their struggles with family and friends because they’re required to sign a non-disclosure agreement, the lawsuit said.
“A Facebook moderator must make high-stakes decisions about extremely difficult political situations and even potential crimes — and they do so in a workplace setting that treats their work as volume, disposable work, as opposed to essential and dangerous front-line work protecting social media users. In short, Facebook moderators sacrifice their own health to protect the public,” the lawsuit said.
Motaung, who shared his story in February with Time, said Meta has passed the responsibility of protecting workers to outsourcing companies and is exploiting people for profit.
A group of Facebook critics called the Real Facebook Oversight Board, as well as Foxglove and The Signals Network, hosted Tuesday’s panel discussion. In a blog post, the groups urged Meta to offer outsourced content moderators the same level of pay, job security and benefits as its own employees. They’re also asking Meta to make other changes, such as publishing a list of the outsourcing companies it works with for content moderation.
Motaung said he believes that content moderation can be improved and has his own ideas as someone who has done the job.
“I’ve actually accepted the destruction of my own mental health and life in general, so what I’m hoping to achieve is to change that because I believe that content moderators can be dealt with in a better way,” he said.