
‘You can’t unsee it’: the content moderators taking on Facebook


By his own estimate, Trevin Brownie has seen more than 1,000 people being beheaded.

In his job, he had to watch a new Facebook video roughly every 55 seconds, he says, removing and categorising the most harmful and graphic content. On his first day, he recalls vomiting in revulsion after watching a video of a man killing himself in front of his three-year-old child.

After that things got worse. “You get child pornography, you get bestiality, necrophilia, harm against humans, harm against animals, rapings,” he says, his voice shaking. “You don’t see that on Facebook as a user. It is my job as a moderator to make sure you don’t see it.”

After a while, he says, the ceaseless horrors begin to affect the moderator in unexpected ways. “You get to a point, after you’ve seen 100 beheadings, when you actually start hoping that the next one becomes more gruesome. It’s a type of addiction.”

Brownie is one of several hundred young people, most in their 20s, who were recruited by Sama, a San Francisco-based outsourcing company, to work in its Nairobi hub moderating Facebook content.

A South African, he is now part of a group of 184 petitioners in a lawsuit against both Sama and Facebook owner Meta for alleged human rights violations and wrongful termination of contracts.

The case is one of the largest of its kind anywhere in the world, and just one of three being pursued against Meta in Kenya. Together, they have potentially global implications for the employment conditions of a hidden army of tens of thousands of moderators employed to filter out the most toxic material from the world’s social media networks, lawyers say.

In 2020, Facebook paid out $52mn to settle a lawsuit and provide mental health treatment for American content moderators. Other cases filed by moderators in Ireland have sought compensation for alleged post-traumatic stress disorder.

Mercy Mutemi and fellow counsel follow proceedings during a virtual pre-trial consultation last month. Should the Kenyan moderators’ case against Meta succeed, it could change working conditions in many more places © Tony Karumba/AFP/Getty Images

But the Kenyan cases are the first filed outside the US that seek to change through court procedures how moderators of Facebook content are treated. Should they succeed, they could lead to many more in places where Meta and other social media providers screen content through third-party providers, potentially improving conditions for thousands of workers paid comparatively little to expose themselves to the worst of humanity.

Just as toiling on factory floors or inhaling coal dust destroyed the bodies of workers in the industrial age, say the moderators’ lawyers, so do those working on the digital shop floor of social media risk having their minds ruined.

“These are frontline issues for this generation’s labour rights,” says Neema Mutemi, a lecturer at the University of Nairobi who is helping to publicise the case. Asked to respond to the allegations, Meta said it does not comment on ongoing litigation.

Online harms

In recent years, Meta has come under increasing pressure to moderate vitriol and misinformation on its platforms, which include Facebook, WhatsApp and Instagram.

In Myanmar, it faced accusations that its algorithms amplified hate speech and that it failed to remove posts inciting violence against the Rohingya minority, thousands of whom were killed and hundreds of thousands of whom fled to Bangladesh.

In India, experts claimed it failed to suppress misinformation and incitement to violence, leading to riots in the country, its largest single market.

In 2021, whistleblower Frances Haugen leaked thousands of internal documents revealing the company’s approach to protecting its users, and told the US Senate the company prioritised “profit over safety”.

The documents showed that Meta had particularly failed to filter divisive content and protect users in non-western countries such as Ethiopia, Afghanistan and Libya, even when Facebook’s own research marked them “high risk” because of their fragile political landscapes and the frequency of hate speech.

Former Facebook employee and whistleblower Frances Haugen testified before the US Senate in 2021 that the company prioritised ‘profit over safety’ © Drew Angerer/Pool/Reuters

In the past few years, Meta has invested billions of dollars to tackle harms across its apps, recruiting about 40,000 people to work on safety and security, many contracted through third-party outsourcing groups such as Accenture, Cognizant and Covalen.

Of those, an estimated 15,000 are content moderators. Outside the US, Meta works with outsourcing companies at more than 20 sites around the world, including in India, the Philippines, Ireland and Poland, which help sift content in multiple languages.

In 2019, Meta requested that Sama — which had been working in Nairobi for several years on labelling data to train artificial intelligence software for clients including Meta and Tesla — take on the work of content moderation. It would be part of a new African hub, to focus on filtering African language content.

Sama says it had never done this type of work previously. But its team on the ground supported taking on the work, which might otherwise have gone to the Philippines, out of a sense of responsibility to bring cultural and linguistic expertise to the moderation of African content. It set about hiring people from countries including Burundi, Ethiopia, Kenya, Somalia, South Africa and Uganda to come and work at its facilities in Nairobi.

It was to prove a mistake. Within four years of starting content moderation, Sama decided to get out of the business, ending its contract with Facebook and firing some of the managers who had overseen the new work.

Brownie, who had been recruited in 2019 in South Africa to work at the Nairobi hub, was among those given notice this January when Sama told its employees it would no longer be moderating Facebook content.

“It is important work, but I think it is getting quite, quite challenging,” Wendy Gonzalez, Sama’s chief executive, tells the FT, adding that content moderation had only ever been 2 per cent of Sama’s business. “We chose to get out of this business as a whole.”

Many of the moderators working in Kenya say the work leaves them psychologically scarred, plagued by flashbacks and unable to maintain normal social relations.

“Once you have seen it you can’t unsee it. A lot of us now, we can’t sleep,” says Kauna Ibrahim Malgwi, a Nigerian graduate of psychology who started at Sama’s Nairobi hub in 2019 and moderated content in the Hausa language spoken across west Africa. She is now on antidepressants, she says.

Cori Crider, a director at Foxglove, a London-based non-profit legal firm that is supporting former Sama moderators with their case, says moderators receive wholly inadequate protection from mental stress.

Moderators in Nairobi this month voted to form a union — what their lawyer Mercy Mutemi says is the first of its kind in the world © Favier/Foxglove

“Policemen who investigate child-abuse imagery cases have an armada of psychiatrists and strict limits on how much material they can see,” she says. But the counsellors employed by Sama on Meta’s behalf “are not qualified to diagnose or treat post-traumatic stress disorder,” she alleges. “These coaches tell you to do deep breathing and finger painting. They are not professional.”

Sama says all the counsellors it employed had professional Kenyan qualifications.

Meta argued that Kenya’s courts had no jurisdiction in the case. But on April 20, in what the moderators and their lawyers saw as a major victory, a Kenyan judge ruled that Meta could indeed be sued in the country. Meta is appealing.

“If Shell came and dumped things off Kenya’s coast, it would be very obvious whether or not Kenya has jurisdiction,” says Mercy Mutemi, a Kenyan lawyer at Nzili and Sumbi Advocates, who is representing the moderators. “This is not a physical, tangible thing. This is tech. But the argument is the same. They’ve come here to do harm.”

Working conditions

The case of the 184 moderators is one of three lawsuits filed on behalf of content moderators by Mutemi’s law firm with Foxglove’s support.

The first was lodged last year on behalf of Daniel Motaung, a South African moderator working in Nairobi, against both Sama and Meta. In that case too, a separate Kenyan judge dismissed Meta’s contention that Kenyan courts had no jurisdiction.

Motaung alleges he was wrongfully dismissed after he tried to form a union to press for better pay and working conditions. He also claims to have been lured into the job under false pretences, unaware of exactly what it entailed.

Sama disputes these claims, saying that content moderators were acquainted with the job during their hiring and training process, and that Motaung was sacked because he had violated the company’s code of conduct. “As far as the union being formed, we have policies in place for freedom of association,” says Gonzalez. “If a union was being formed, that is not a problem.”

Content moderators recruited from outside Kenya were paid about KSh60,000 a month, including an expat allowance, equivalent to about $564 at 2020 exchange rates.

Daniel Motaung, a South African moderator working in Nairobi, filed a lawsuit against Sama and Meta alleging he was fired for trying to form a union © Favier/Foxglove

Moderators typically worked a nine-hour shift, with an hour’s break, two weeks on days and two weeks on nights. After tax, they received an hourly wage of roughly $2.20.

Sama says those wages were several times the minimum wage and equivalent to the salary received by Kenyan paramedics or graduate level teachers. “These are meaningful wages,” says Gonzalez.

The data suggests the wages for expat workers are just over four times Kenya’s minimum wage, but Crider from Foxglove says she is not impressed: “$2.20 an hour to put yourself through repeated footage of murder, torture and child abuse? It’s a pittance.”

Haugen, the Facebook whistleblower, said Motaung’s struggle for workers’ rights was the digital-era equivalent of previous struggles. “People fighting for each other is why we have the 40-hour work week,” she said, speaking at an event alongside Motaung in London last year. “We need to extend that solidarity to the new front, on things like content-moderation factories.”

This month, moderators in Nairobi voted to form what their lawyers say is the first union of content moderators in the world. Motaung called the resolution “a historic moment”.

The last of the three cases being heard in Kenya deals not with labour law, but with the alleged consequences of material posted on Facebook. It claims that Facebook’s failure to deal with hate speech and incitement to violence fuelled ethnic violence during Ethiopia’s two-year civil war, which ended in November.

Crider says the three cases are related because the poor treatment of content moderators results directly in unsafe content being left to spread unchecked on Meta’s platforms.

Abrham Meareg, the son of an Ethiopian academic shot dead after being attacked in Facebook posts, has brought a case against Meta over its alleged failure to deal with hate speech © Foxglove

One of two plaintiffs, researcher Abrham Meareg, alleges that his father, a chemistry professor, was killed in Ethiopia’s Amhara region in October 2021 after a post on Facebook revealed his address and called for his murder. Abrham says he asked Facebook multiple times to remove the content, without success.

Sama employed around 25 people to moderate content from Ethiopia in three languages — Amharic, Tigrinya and Oromo — at the time of a conflict that stirred ethnic animosity and may have claimed up to 600,000 lives.

Lawyers are seeking the establishment of a $1.6bn victims’ fund and better conditions for future content moderators. Crucially, they are also asking for changes to Facebook’s algorithm to prevent this happening elsewhere in future.

Lawyers say that to compete with other platforms, Facebook deliberately maximises user engagement for profit, which can help unsafe or hazardous content go viral.

“Abrham is not an outlier or a one-off,” says Rosa Curling, a director at Foxglove. “There are endless examples of things being published on Facebook, [calls for people] to be killed. And then that, in fact, happening.”

Curling says the quality of Facebook moderation in the Nairobi hub is affected by the working practices now being challenged in court.

Gonzalez of Sama acknowledges that regulation of content moderation is deficient, saying the issue should be “top of mind” for social media company chiefs. “These platforms, and not just this one [Facebook] in particular, but others as well, are kind of out in the wild,” she says. “There need to be checks and balances and protections put in place.”

Captive Ethiopian soldiers walk past cheering crowds in Mekele, capital of the Tigray region, in 2021. A court in Kenya was told that material posted on Facebook had fuelled ethnic violence during Ethiopia’s two-year civil war © Yasuyoshi Chiba/AFP/Getty Images

While Meta contracts tens of thousands of human moderators, it is already investing heavily in their replacement: artificial intelligence software that can filter misinformation, hate speech and other forms of toxic content on its platforms. In the most recent quarter, it said that 98 per cent of “violent and graphic content” taken down was detected using AI.

However, critics point out that the overwhelming amount of harmful content that remains online in places like Ethiopia is evidence that AI software cannot yet pick up the nuances required to moderate images and human speech.

‘Not a normal job’

As well as potentially setting legal precedent, the cases in Kenya offer a rare glimpse into the working lives of content moderators, who normally toil away in anonymity.

The non-disclosure agreements they are required to sign, usually at the behest of contractors like Sama, forbid them from sharing details of their work even with their families. Gonzalez says this is to protect sensitive client data.

Frank Mugisha, a former Sama employee from Uganda, has another explanation. “I’ve never had a chance to share my story with anyone because I’ve always been kept a dirty secret,” he says.

Following the loss of their jobs, Sama employees from outside Kenya now face the possibility of expulsion from the country, though a court has issued an interim injunction preventing Meta and Sama from terminating the moderators’ contracts until a judgment is made on the legality of their redundancy.

Still, several former Sama employees have not been paid since April, when the company terminated its contract with Meta, and face eviction for non-payment of rent.

All the content moderators who spoke to the FT had signed non-disclosure agreements. But their lawyers said these did not prevent them from discussing their working conditions.

Kenyan riot police monitor a demonstration by Facebook content moderators, who are involved in a redundancy case, outside Sama’s offices in Nairobi earlier this month © Daniel Irungu/EPA-EFE

Moderators from a range of countries across Africa were consistent in their criticisms. All said they had taken on the job without being properly informed about what it entailed. All complained of constant pressure from managers to work at speed, with a requirement to deal with each “ticket”, or item, in 50 or 55 seconds.

Meta said it does not mandate quotas for content reviewers and that they “aren’t pressured to make hasty decisions”, though it added that “efficiency and effectiveness” are important factors in the work.

Malgwi, the Nigerian psychology graduate, is dismissive of what moderators allege is Facebook’s attempt to keep its distance by using third-party companies like Sama. “We log in every morning to Meta’s platform,” she says. “You see: ‘Welcome. Thank you for protecting the Meta community’.”

Fasica Gebrekidan, an Ethiopian moderator who studied journalism at Mekelle University, got a job at Sama shortly after fleeing Ethiopia’s civil war in 2021. After learning she would be working indirectly for Meta, she thought “maybe I’m the luckiest girl in the world,” she says. “I didn’t expect dismembered bodies every day from drone attacks,” she adds.

Until now, Gebrekidan has not spoken to anyone, shielding the nature of her work even from her mother. “I know what I do is not a normal job,” she says. “But I consider myself a hero for filtering all this toxic, negative stuff.”
