Meta, formerly known as Facebook, has long had a troubled record of moderating harmful content on its platforms. Now, a new report from Internews highlights the company's operational failures within its Trusted Partner program, which allows vetted organisations to flag dangerous and harmful content to Meta.
According to the report, the Trusted Partner program, which includes over 456 civil society and human rights groups worldwide, has faced significant obstacles, including delays of several months before Meta responds. Worse, this sluggishness extends even to severe cases of harmful content, such as death threats, incitement to violence, and other time-critical matters.
“Two months plus. And in our emails, we tell them that the situation is urgent and people are dying. The political situation is very sensitive, and it needs to be dealt with very urgently. And then it is months without an answer,” reads the report.
Regional disparities in Meta's responses
Beyond the widespread delays, the report also highlights regional discrepancies in Meta's efforts to combat harmful content. Ukrainian partners reported receiving responses within 72 hours, while Ethiopian partners waited several months for answers to reports related to the Tigray War. Such delays allegedly contributed to the death of Tigrayan professor Meareg Amare, who was targeted by racist posts on Facebook that the platform failed to act on promptly.
Moreover, the report echoes disclosures from former Facebook employee Frances Haugen, whose leaked internal documents revealed a marked disregard for countries in South Asia and Africa, including Ethiopia.
“Trusted flagger programs are vital to user safety, but Meta’s partners are deeply frustrated with how the program has been run,” reads the report.