Ghana moderators sue Meta over impact of extreme content
Occurred: April 2025
Content moderators working for Meta in Ghana are suing the technology company, alleging severe psychological harm from exposure to graphic material while reviewing social media posts.
Content moderators in Accra, Ghana, employed by Majorel, a firm owned by France's Teleperformance and contracted by Facebook parent Meta, have filed a lawsuit against Meta, claiming significant psychological distress from repeatedly viewing and removing disturbing and violent content on Meta's platforms.
The moderators report suffering from depression, anxiety, insomnia, and substance abuse as a direct result of their working conditions. One worker reported attempting suicide.
They also allege that the mental health support provided by their employer was inadequate and that their requests for help were often ignored.
The lawsuit follows similar complaints and legal actions in Kenya, where over 100 Facebook content moderators were diagnosed with severe post-traumatic stress disorder (PTSD) after being exposed to graphic content.
After Meta laid off its entire moderation workforce in Kenya, operations were moved to Ghana, where workers now report facing similarly exploitative and unsafe conditions.
The root cause of the lawsuit is the nature of content moderation work, which requires moderators to review large volumes of graphic, violent, and disturbing material to enforce Meta’s community standards at high speed with few breaks and little support.
Workers allege that Meta, through its contractor Majorel, failed to provide adequate mental health care, imposed grueling performance targets, and maintained poor working and living conditions, including surveillance, low pay, and minimal breaks after viewing traumatic content.
Meta's use of outsourcing and opaque employment practices has allowed it to distance itself from direct responsibility, despite setting the policies and targets that shape the moderators' daily work.
The case highlights ongoing concerns about the treatment of outsourced tech workers in the Global South, where labour costs are lower and protections are weaker.
It raises questions about the ethical responsibility of Big Tech companies like Meta to safeguard the well-being of workers who perform socially important but psychologically hazardous tasks.
Facebook content moderation system
Instagram content moderation system
Operator:
Developer: Meta
Country: Ghana
Sector: Multiple
Purpose: Moderate content
Technology: Content moderation system
Issue: Accountability; Employment; Liability; Safety; Transparency
The Bureau of Investigative Journalism. Suicide attempts, sackings and a vow of silence: Meta moderators face worst conditions yet
https://www.primenewsghana.com/tech/meta-faces-ghana-lawsuits-over-impact-of-extreme-content-on-moderators.html
https://ghananewsonline.com.gh/meta-faces-lawsuit-in-ghana-over-impact-of-extreme-content-on-moderators/
Page info
Type: Incident
Published: April 2025