Meta Lifts Ban on "Shaheed" Following Oversight Board Review
Meta Platforms announced that it will lift its ban on the word "shaheed," or "martyr" in English, after a year-long review by its Oversight Board deemed the company's approach "overbroad."[1]
For years, Meta has faced criticism over its content policies related to the Middle East, particularly after a 2021 study it commissioned found that its practices had an "adverse human rights impact" on Palestinians and other Arabic-speaking users.
The Oversight Board, which is funded by Meta but operates independently, began the review last year because "shaheed" accounted for more content removals on Meta's platforms than any other single word or phrase.
In March, the review concluded that Meta's rules failed to account for the word's varied meanings and led to the removal of content that did not praise violent actions.
Meta, the parent company of Facebook and Instagram, acknowledged the review's findings and stated that tests showed removing content when "shaheed" was paired with other violating content captured the most harmful content without disproportionately impacting user expression.
Recently, Meta introduced its first AI-driven ad targeting program for businesses on its widely used messaging application, WhatsApp.
Meta also announced a significant surge in young adult users on its flagship app, Facebook, marking the highest numbers in three years.