Occurred: November 2021
The doxxing and murder of chemistry professor and Tigrayan ethnic group member Meareg Amare Abrha sparked outrage and triggered a USD 2 billion lawsuit against Facebook owner Meta Platforms.
Abrha was assassinated outside his family home in Bahir Dar, the capital of Ethiopia’s Amhara regional state, by a group of armed men who had followed him home from his university on motorbikes and shot him at close range as he tried to enter the house.
His murder came after he and Tigrayan colleagues at the university had been targeted on Facebook, and shortly after his ethnicity, workplace and home address were shared, and calls for his death posted, on a Facebook page named 'BDU STAFF'.
Despite his son reporting the posts, one remained online for over a year.
The perpetrators actively prevented bystanders from administering aid or transporting him to a nearby hospital, exacerbating his injuries and ensuring his death. His body remained unattended for seven hours before municipal workers buried him in an unmarked grave.
The posts targeting Professor Abrha were amplified by Facebook's algorithmic systems, which prioritise engagement-driven content without sufficient safeguards against harmful material.
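The page's sources do not include Meta's actual ranking code; the sketch below is a purely hypothetical illustration (the Post class, rank_feed function, weights and safety_penalty parameter are all invented) of the general mechanism described above: a feed ranked only by engagement will surface inflammatory, high-reaction posts above benign ones unless flagged content is explicitly demoted.

```python
# Illustrative sketch only: a toy engagement-weighted ranker. All names and
# weights are hypothetical and do not reflect Meta's actual systems.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int
    flagged_as_harmful: bool = False  # e.g. by a language-specific classifier

def engagement_score(post: Post) -> float:
    # Shares and comments weighted above likes, a common engagement heuristic.
    return post.likes + 3 * post.comments + 5 * post.shares

def rank_feed(posts: list[Post], safety_penalty: float = 0.0) -> list[Post]:
    # With safety_penalty = 0, ranking is driven purely by engagement,
    # so inflammatory posts that attract many reactions are amplified.
    def score(p: Post) -> float:
        s = engagement_score(p)
        if p.flagged_as_harmful:
            s -= safety_penalty * s
        return s
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Campus research update", likes=40, shares=2, comments=5),
        Post("Inflammatory post doxxing a named individual",
             likes=300, shares=120, comments=250, flagged_as_harmful=True),
    ]
    # Engagement-only ranking puts the harmful post first...
    print([p.text for p in rank_feed(feed)])
    # ...whereas a strong safety penalty demotes it.
    print([p.text for p in rank_feed(feed, safety_penalty=0.9)])
```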
Internal documents reveal that Facebook’s "network-based", AI-powered moderation systems, which the company relied on in Ethiopia because it lacked the language capacity to perform meaningful human moderation, were ill-equipped to handle content in languages such as Amharic, Oromo and Tigrinya.
Furthermore, Meta was accused of neglecting safety measures for users in regions such as Africa while concentrating resources on Western markets, notably the United States, its largest and most profitable market.
The result: the spread of content advocating hatred and violence against Tigrayans during the November 2020 to November 2022 armed conflict in northern Ethiopia, the harassment of Tigrayans across the country, and Professor Abrha's death.
The incident raises serious and urgent questions about the dangers posed by algorithms that prioritise virality over safety.
It also highlights the need for "big tech" companies to place a greater focus on content moderation in the so-called "Global South" and in conflict zones.
October 2021. The 'Facebook Papers' leaked by whistleblower Frances Haugen and shared with the US SEC revealed that Facebook knew it was being used to incite violence in Ethiopia but did little to stop the spread of such content.
November 2021. Professor Meareg is murdered. A week after his murder, Facebook announced a series of measures intended to address abusive and violent material on its platform in Ethiopia, including reducing the spread of material the company’s automated moderation technology had flagged as likely to be hate speech.
December 2022. Professor Abrha's son Abrham, together with the Katiba Institute and former Amnesty International researcher Fisseha Tekle, filed a GBP 2 billion class-action lawsuit against Facebook owner Meta, alleging that Facebook's content moderation was 'woefully inadequate' and that its algorithm helped fuel the viral spread of hate and violence during Ethiopia's civil war. The lawsuit (or 'petition'), filed by London-based legal non-profit Foxglove in Nairobi, Kenya, where Facebook had opened a major content moderation hub for Eastern and Southern Africa in 2019, accused Meta of employing too few moderators able to review posts in the Amharic, Oromo and Tigrinya languages. It also argued that Facebook's algorithm promoted 'hateful and inciting' content because such material is likely to draw more interaction from users.
April 2023. A Kenyan court granted Abrham and the other petitioners leave to serve the lawsuit on Meta in California, USA, after they were unable to identify a physical office for the company in Kenya.
April 2025. Kenya's High Court ruled that it has jurisdiction to hear the case, allowing the lawsuit to go ahead in Nairobi.
Facebook content moderation system
Meta/Facebook (2021). An Update on Our Longstanding Work to Protect People in Ethiopia
Operator: Facebook
Developer: Facebook
Country: Ethiopia
Sector: Education
Purpose: Moderate content
Technology: Content moderation system
Issue: Accountability; Bias/discrimination; Business model; Human/civil rights; Safety; Transparency
Amnesty (2023). Ethiopia: ‘A death sentence for my father’: Meta’s contribution to human rights abuses in northern Ethiopia
Global Witness (2022). ‘Now is the time to kill’: Facebook continues to approve hate speech inciting violence and genocide during civil war in Ethiopia
The Bureau of Investigative Journalism (2022). Facebook accused by survivors of letting activists incite ethnic massacres with hate and misinformation in Ethiopia
Facebook Oversight Board (2021). Oversight Board upholds Meta's original decision: Case 2021-014-FB-UA
Page info
Type: Incident
Published: February 2023
Last updated: April 2025