Pressure increases for Meta to overhaul African content moderation


As Meta's main third-party content moderator in Africa pulls out, rights groups are calling for the company to overhaul its practices.

Meta's content moderation practices in Africa are coming under increasing fire. A recent lawsuit claims that the company has effectively fuelled violence in Ethiopia’s civil war, and one of its main contractors in the region has withdrawn its moderation services.

Filed in December, the lawsuit alleges that the company has failed to employ enough content moderation staff who speak local languages and that it has amplified hateful content.

"The spread of dangerous content on Facebook lies at the heart of Meta’s pursuit of profit, as its systems are designed to keep people engaged," says Flavia Mwangovya, Amnesty International’s deputy regional director of East Africa, Horn, and Great Lakes Region.

"This legal action is a significant step in holding Meta to account for its harmful business model."

The case is being brought by the Kenyan rights group Katiba Institute and Ethiopian researchers Fisseha Tekle and Abrham Meareg, whose father, Professor Meareg Amare, was accused in a series of Facebook posts of stealing equipment from Ethiopia's Bahir Dar University, where he worked.

Some of the posts gave the neighborhood where he lived and called for his death. He was later shot outside his home and left to bleed to death.

The complainants are calling on Meta to start demoting violent incitement, as it did following the US Capitol riot of 6 January 2021.

They also want the company to employ more content moderators, in particular people who can handle posts in the minority local languages, and to create a $2.4 billion restitution fund for victims of hate and violence.

Contractor pulls out

Now, though, Meta's content moderation in Africa is becoming even more mired in controversy, with the news that its main contractor in the region, Nairobi-based Sama, is pulling out of the arrangement and cutting around 200 staff.

While the company was originally hired to carry out data labeling, its work quickly expanded to include the moderation of extremely graphic violence, including beheadings and child abuse.

Sama, too, is the subject of a lawsuit (along with Meta), which claims that it has subjected staff to 'forced labor and human trafficking for labor'. It has also been accused of hiring people as 'call center staff' and the like without making clear what the work actually involved.

Now, Sama's departure from the scene is prompting rights groups to renew their calls for Meta to radically overhaul its content moderation practices.

"Meta should increase the number of moderators for the region to adequately cover local languages and dialects, and also be more transparent about their algorithms which are promoting harmful content," says Access Now’s Bridget Andere.

Not the first time

Meta has come under fire for inadequate content moderation before. Late in 2021, Rohingya refugees from Myanmar filed a lawsuit against the company, alleging that it failed to take action against anti-Rohingya hate speech that fuelled violence; the company has acknowledged that it did not do enough.

And last October, an investigation by rights group Global Witness found that, in the run-up to the Brazilian presidential election, every one of the test ads it submitted to Facebook was approved for publication, despite containing false information such as the wrong voting date, content denying the credibility of the election, and calls urging people not to vote.

"This key vote in Brazil has been marred by a huge spike in political violence, killings, threats and kidnappings. It’s a sad reality that this tense environment has been fuelled online," commented Jon Lloyd, senior adviser at Global Witness. "The issues raised here are not simply what could or might be happening ⁠— it is happening."

Meta is, of course, defending itself against accusations that it is failing to adequately moderate content in Africa.

"Our safety and integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions," says a spokesperson.

"We employ staff with local knowledge and expertise, and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya."

However, it's clear that at times of conflict or uncertainty, more needs to be done. While issues around minority languages have clearly played a part in Meta's level of content moderation in Ethiopia, Global Witness says the company is 'extremely poor' at moderating content even in the country's main language.

The departure of Sama gives Meta a clear opportunity to overhaul its content moderation practices in the region. Cori Crider, director of Foxglove, which is supporting the legal action, told Reuters that Facebook should 'take moderation in-house, hire every one of the 260 content moderators in Nairobi who do vital safety work, and value these people with decent wages, full clinical mental health support and dignity and prestige'.

Meta has so far simply commented that, following Sama's departure, it aims to 'ensure there's no impact on our ability to review content'.

However, this will, of course, be easier said than done. Given the working conditions that Sama staff have complained about, it may be hard to tempt qualified workers to sign up. Doing so may mean Meta has to pay them a great deal more, and so it should.