Facebook accused of inciting violence against Muslims in Sri Lanka
Occurred: February 2018-
Page published: September 2024
Facebook’s failure to moderate Sinhala-language hate speech and misinformation in 2018 enabled the spread of anti-Muslim rhetoric, which fueled communal riots that resulted in deaths, property destruction, and a national state of emergency in Sri Lanka.
Fueled by misinformation spread on social media, including a viral video falsely alleging that a Muslim restaurateur had poisoned Sinhala-Buddhist customers, riots started in Ampara, Sri Lanka, and quickly spread.
At least three people were killed and many businesses and mosques were destroyed in the violence that ensued. In addition, the riots negatively impacted Sri Lanka's economy, with many tourists cancelling visits to the heavily tourist-dependent country.
The Sri Lankan government responded by temporarily banning Facebook and other social media platforms to prevent further violence.
The root cause was a catastrophic lack of local moderation resources; at the time of the riots, Facebook reportedly had only two Sinhala-speaking content reviewers for millions of users.
This lack of linguistic and cultural context meant that automated filters and human moderators failed to recognize slurs and coded incitement.
Corporate accountability was limited by a "reactive" rather than "proactive" approach to human rights in smaller markets; the company had ignored years of warnings from local activists about rising ethno-nationalist content.
The company later admitted that its platform had been misused and recognised the human rights impacts resulting from its inaction.
The controversy surrounding Facebook's role in the riots highlighted the broader issue of social media's impact on communal tensions and violence, particularly against vulnerable groups.
Following the incident, Facebook committed to improving its content moderation practices, including hiring local language moderators and enhancing technology to detect hate speech.
Facebook content moderation system
Developer: Facebook
Country: Sri Lanka
Sector: Religion; Politics
Purpose: Moderate content
Technology: Content moderation system; Machine learning
Issue: Accountability; Fairness; Human/civil rights; Mis/disinformation; Safety; Transparency
Facebook. Sri Lanka Human Rights Impact Assessment
February 2018. Misinformation begins spreading about "infertility pills" in Muslim restaurants in Ampara.
March 5, 2018. Violence erupts in Kandy following the death of a Sinhalese man; extremist pages call for retaliatory attacks.
March 6-7, 2018. The Sri Lankan government declares a state of emergency and blocks Facebook, WhatsApp, and Instagram.
March 14, 2018. Sri Lankan officials meet with Facebook executives to demand faster takedown systems.
November 2018. Facebook commissions an independent human rights impact assessment.
May 2020. Facebook apologises for its role in the 2018 anti-Muslim riots after an investigation by human rights advisors Article One concludes the company had helped instigate the violence.
Muslim Advocates. Complicit - The human cost of Facebook's disregard for human life
AIAAIC Repository ID: AIAAIC0126