TikTok For You pushes suicide, violence, misogyny
Occurred: March 2023
TikTok automatically showed violent, extremist, abusive, self-harm, and misogynistic videos to users, raising doubts about the company's stated commitments to provide a safe environment, and about the effectiveness of its safety technologies and governance.
According to a report (pdf) by US corporate accountability group Eko, TikTok's For You recommendation algorithm automatically showed violent and self-harm videos to young users within ten minutes of their starting to use the platform. Eko researchers also found that hashtags on the site associated with suicide content had garnered 8.8 billion views.
The report was published during a congressional hearing at which TikTok CEO Shou Zi Chew was accused of allowing harmful content to be served to young users, inflicting 'emotional distress' on them.
A 2021 report (pdf) by the London-based think tank the Institute for Strategic Dialogue (ISD) found that anti-Asian and pro-Nazi videos were garnering millions of views on TikTok, often using pop songs to evade the platform's content moderation systems.
System 🤖
TikTok For You recommendation algorithm
Developer: ByteDance/TikTok
Operator: ByteDance/TikTok
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Safety; Ethics
Transparency: Governance; Black box
Research, advocacy 🧮
Eko (2023). Suicide, Incels, and Drugs: How TikTok's deadly algorithm harms kids (pdf)
ISD (2021). Hatescape: An In-Depth Analysis of Extremism and Hate Speech on TikTok (pdf)