Meta automated moderation wrongly removes Israel-Hamas videos

Occurred: October-December 2023

Meta’s automated content moderation system unfairly removed videos depicting hostages, injured civilians, and possible casualties in the Israel-Hamas war from Facebook and Instagram, drawing criticism from its Oversight Board and others.

In one instance, a video depicted an Israeli woman pleading with her kidnappers not to kill her during the October 7 attack on Israel by Hamas. In another, a video posted to Instagram showed what appeared to be the aftermath of a strike on or near Al-Shifa Hospital in Gaza City during Israel's ground offensive in the north of the Gaza Strip, including dead and injured Palestinians, among them children.

In both cases, the videos were automatically removed and later reinstated. Meta's independent Oversight Board ruled that the videos should not have been removed, and found that the company had lowered its content moderation thresholds following the October 7 attack to catch violating content more easily, a decision that 'also increased the likelihood of Meta mistakenly removing non-violating content related to the conflict.'

The Board also argued that inadequate human-led moderation during these types of crises, especially in non-English languages, could lead to the 'incorrect removal of speech that may be of significant public interest', and that Meta should have been swifter to allow content 'shared for the purposes of condemning, awareness-raising, news reporting or calling for release' with a warning screen applied.


Developer: Meta
Country: Israel; Palestine
Sector: Politics
Purpose: Detect & remove content violations
Technology: Content moderation system; Machine learning
Issue: Governance; Human/civil rights
Transparency: Governance