
Warren presses Meta on content moderation policies amid Israel-Hamas war

Sen. Elizabeth Warren (D-Mass.) pressed Meta, the parent company of Facebook and Instagram, on Thursday for information about its moderation of Palestinian content amid the ongoing Israel-Hamas war. 

Dozens of human rights and civil society organizations have raised concerns about Meta’s “reported suppression, filtering, and mistranslation of Palestine-related content over the past two months,” Warren noted in a letter to Meta CEO Mark Zuckerberg.

“Amidst the horrific Hamas terrorist attacks in Israel, a humanitarian catastrophe including the deaths of thousands of civilians in Gaza, and the killing of dozens of journalists, it is more important than ever that social media platforms do not censor truthful and legitimate content,” the senator said.

Since Hamas launched a surprise attack on Israel on Oct. 7 — and Israel responded with a bombing campaign and ground invasion of Gaza — Meta has reportedly applied stricter filters to content coming from the Palestinian territories.

The Wall Street Journal reported in late October that the company had lowered the certainty threshold for an automated system that hides comments it judges to violate Meta’s policies, from the usual 80 percent to just 25 percent.

Instagram also reportedly flagged and hid comments containing the Palestinian flag emoji as “potentially offensive,” and Meta apologized after Instagram’s auto-translation inserted the word “terrorist” into the bios of some Palestinian users whose profiles included an Arabic phrase.

“Reports of Meta’s suppression of Palestinian voices raise serious questions about Meta’s content moderation practices and anti-discrimination protections,” Warren said in Thursday’s letter. 

“Social media users deserve to know when and why their accounts and posts are restricted, particularly on the largest platforms where vital information-sharing occurs,” she added. “Users also deserve protection against discrimination based on their national origin, religion, and other protected characteristics.”

Meta has previously faced allegations of unfairly moderating Palestinian speech on its platforms.

After an outbreak of violence in May 2021 prompted similar accusations, the company commissioned an independent due diligence report that found its actions “had an adverse human rights impact on the rights of Palestinian users to freedom of expression, freedom of assembly, political participation, and non-discrimination.”