On July 6, 2020, as the prominent Iraqi historian Hisham al-Hashimi was walking toward his vehicle outside his home in eastern Baghdad, a group of gunmen sped toward him on motorcycles and fired five times at point-blank range. By the time he reached the hospital, al-Hashimi was dead.
An outspoken critic of the country's militia groups, al-Hashimi was deeply connected across Iraq's various power factions, from ISIS to government and opposition politicians.
In the months leading up to his death, viral posts on Facebook and other social-media sites accused al-Hashimi of being a spy for US, Israeli, or British forces, as well as conspiring to further destabilize Iraq. At the time, Iraq was going through the largest civil uprising in the country since the US invasion in 2003.
Al-Hashimi knew that one of his closest friends, Aws al-Saadi, the founder of the nonprofit Tech4Peace, was a Meta “trusted partner” and had a direct line to the company to help remove content like the threats against his life. In September 2019, al-Hashimi reached out to al-Saadi to ask whether he could help remove the harmful posts targeting him on Facebook.
Al-Saadi did what he could, but the responses from Meta were inconsistent. Some posts were removed within a day, while others stayed up for as long as a week.
One post from April 2019, which wrongly claimed al-Hashimi was an al-Qaida leader, was never taken down. Al-Saadi flagged it, but Meta replied that the post did not violate the company’s policies.
On July 6, 2020, the day of al-Hashimi’s death, al-Saadi wrote back: “They killed him now.”
“One of the reasons for his killing was Meta,” al-Saadi told Insider.
A draft report indicates problems within Meta's trusted-partner network are all too common
Al-Saadi isn’t alone. Facebook and its parent company, Meta, have faced fierce criticism for failing to adequately moderate the platform.
In many countries, Meta relies on reporting from local civil-society groups and experts to flag hate speech and misinformation on Facebook. The trusted-partner program gained steam in the late 2010s, after Meta drew criticism for its role in fueling the genocide in Myanmar, and the company now considers the program essential to its moderation strategy in politically fraught countries such as Iraq.
A draft report by the media nonprofit Internews, obtained by Insider, has concluded that the lapses in Meta’s trusted-partner program are putting people at grave risk. The group is one of Meta’s largest trusted partners and receives funding from Meta for various projects.
Meta did not respond to a request for comment on either the death of al-Hashimi or Internews’ report.
It did, however, provide three pages of answers to Internews’ questions, which were included in the draft report. In those comments, Meta acknowledged that COVID-19 had “severely impacted” its operations, which “resulted in poor reporting experiences” for its trusted partners from 2019 to 2021.
“During this period our content review teams operated at limited capacity and were unable to respond as quickly to trusted partner channel reports as we would like and as they have done in the past,” the company said. “Under these difficult circumstances, we prioritized the most harmful content for our teams to review, such as risk of imminent physical harm or violence.”
Meta added that its response times had improved in 2022 and that it now expected to act on reports within one to five days. For more complex cases, which Meta says trusted-partner reports often are, the response time can be longer.
Four hundred sixty-five organizations have been enrolled in Meta’s trusted-partner program, it said, and it has at least one trusted partner in 122 countries. Meta, which previously told Insider that the program launched in 2012, said in its comments to Internews that it didn’t formalize its processes until 2019.
Internews’ report echoed the findings of an Insider investigation from earlier this year that revealed how Meta ignored, or was catastrophically delayed in reacting to, alarms sounded by its trusted partners in Ethiopia as two violent conflicts were underway in the country.
Meta’s responses to trusted partners around the world have been “erratic”
The report, based on a survey of 24 trusted partners, including al-Saadi, suggests that trusted partners working around the world are met with severely delayed and “erratic” response times when flagging hate speech and other harmful content, as well as imminent threats to people’s lives.
Partners have at times waited weeks, if not months, for a response from Meta, the report found. In some cases, partners received no response at all. That frustration has led some partners to ditch the program.
Al-Saadi and other partners found a work-around: appealing directly to personal contacts at Meta over WhatsApp or Signal, which often had better success than the dedicated reporting channel for trusted partners.
Paul Barrett, the deputy director of the Center for Business and Human Rights at New York University’s Stern School of Business, said the consequences of Meta ignoring its trusted partners were all too often dire.
“The stakes here are the protection of individuals, how countries are able to sustain their political systems, or risks to public health,” Barrett said.
Partners monitor everything from political hate speech and disinformation to vaccine misinformation.
One exception was Ukraine. There, the report said, partners experienced better response times from Meta, with an average of about 72 hours. By contrast, during the war in the Tigray region of Ethiopia, partners could wait months on end and receive no response.
Rafiq Copeland, a senior advisor at Internews and one of the authors of the report, said Meta did not provide an explanation for the disparity between response times for Ukrainian and Ethiopian partners.
“I think we have to assume it’s a matter of priorities and resourcing,” Copeland said.
The Internews report also said that these failures to respond to trusted partners and take down content could partly be attributed to the program being “significantly under-resourced and understaffed,” a problem that has been compounded by the company’s most recent layoffs.
Internews found that participating in the trusted-partner program, in many cases, increased the risks to trusted partners themselves — corroborating the stories of several trusted partners in Ethiopia who told Insider they received death threats as a result of their work.
Trusted partners around the world are ditching the program
After al-Hashimi's death, Meta sent al-Saadi condolences in emails viewed by Insider and asked him to participate in a meeting where he could give the company more feedback on its processes.
At the meeting, distraught over his friend’s death, al-Saadi said he told the company: “Why are you asking me for my opinion if you aren’t going to do anything about it?”
Over the next few weeks, overcome with sadness and second-guessing what he could have done differently, he temporarily swore off reporting content.
Internews said trusted partners submitted only about 1,000 reports a month to Meta.
“The partners we spoke to were shocked when they heard that number,” said Copeland, who acknowledged the figure may be an underestimate because it didn't capture reports filed through informal channels such as WhatsApp or Signal. “Mainly because they assumed that the slow response times had to do with a high volume of cases.”
Not all of those cases are acted upon.
“When we see there’s only 33 cases a day, that tells us that the system’s failures are not directly related to volume,” Copeland said.
One explanation for this low figure might be that partners aren’t using the program because of their poor experiences with it.
“I think it’s reasonable to say if partners were happier with the program, they’d report more often,” Copeland said.
Meta does not have a way for trusted partners to easily escalate reports when someone’s life is in danger
Meta’s policies, the report found, have been opaque and inconsistent when partners have flagged posts threatening activists, journalists, or other human-rights defenders. Partners sometimes received responses saying no actions would be taken. But when they appealed through a personal contact at Meta, those decisions were often reversed.
Insider previously reported that an Ethiopian trusted partner flagged the risks of viral hate speech targeting the professor Meareg Amare through both the trusted-partner channel and subsequent Zoom meetings with Meta, but the platform failed to act. That investigation is now critical evidence in a $1.6 billion hate-speech lawsuit against Facebook.
In February, al-Saadi himself was targeted by viral misinformation claiming he was an American collaborator working to destabilize Iraq. Much like what happened to his friend al-Hashimi, he knew such accusations could jeopardize his life. Al-Saadi reported the posts to Meta on February 17. A week later, he escalated the issue to a personal contact at Meta who said they’d look into it but didn’t follow up. The posts were removed on April 18, two months after they were first reported.
“There needs to be a case-management system which allows for those really urgent cases to be identified and dealt with quickly, which one way or another doesn’t seem to exist,” Copeland said.
Al-Saadi, who now lives in the Netherlands, said that if he were still in Iraq, he would have feared far more for his life.
“You laugh, but at the same time, you cry about it,” he said.
Partners are calling for the system to be overhauled
Copeland remains hopeful that Meta will reform the trusted-partner program. Internews said Meta needed to redesign the program in direct collaboration with its trusted partners.
Internews hopes this might help create a trusted-partner channel with improved communication, more transparency, and faster response times.
“I think Meta needs to commit to reform and a genuine codesign of the program,” Copeland said. “So far, we haven’t seen that commitment, but it’s a work in progress.”
Additional reporting by Reem Makhoul