Meta announced it would formally lift a years-long ban on the Arabic word "shaheed," after its Oversight Board handed down a decision stating the social media company was in effect censoring millions of people by restricting the word.
Meta first enacted a blanket ban on "shaheed," which translates to English as "martyr," in 2021, over concerns that its use would be interpreted as glorifying violence under the company's Dangerous Organizations and Individuals (DOI) policy.
The restrictions upset many in Arabic-speaking communities, and even Meta's own study, conducted after the ban took effect, found that it had an "adverse human rights impact" on Palestinians and other Arabic-speaking users.
The Oversight Board, which operates independently although it is funded by Meta, completed a year-long assessment of the platforms' restrictions on the word, publishing a 37-page Policy Advisory Opinion on its findings Tuesday.
According to the Board, Meta's "overly broad" restrictions on the word have likely led to significantly more content removals under its Community Standards than any other single word or phrase on its platforms since the ban was enacted.
“It is imperative Meta take effective action to ensure its platforms are not used to incite acts of violence, or to recruit people to engage in them,” the Board stated in its executive summary.
However, the Board also stated that “Meta’s response to this threat must also be guided by respect for all human rights, including freedom of expression.”
Since Israel declared war on Hamas following the October 7th attacks, the ban has only become more of a sticking point.
Meta had requested that the Board review the policy last February over the censorship concerns; this March, the Board advised the company to change its policy and allow the use of "shaheed" in contexts that clearly do not promote violence.
Meta posted a response to the Board's advisory, stating that it would continue "to remove content when 'Shaheed' is paired with otherwise violating content," including content carrying what the Board designated as the "three signals of violence."
Meta, the parent company of Facebook and Instagram, said its revised policy would capture "the most potentially harmful content without disproportionately impacting voice."
The Oversight Board praised Meta’s commitment to revising its policy.