TikTok For You pushes suicide, violence, misogyny
Occurred: March 2023
TikTok automatically showed violent, extremist, abusive, self-harm, and misogynistic videos to users, raising doubts about the company's stated commitment to providing a safe environment for its users.
According to a report (pdf) by US corporate accountability group Eko, TikTok's For You recommendation algorithm automatically showed violent and self-harm videos to young users within ten minutes of them starting to use the platform.
Eko researchers also found that hashtags associated with suicide content had garnered 8.8 billion views on the platform.
The report was published during a congressional hearing in which TikTok CEO Shou Zi Chew was accused of allowing harmful content to be served to young users and of inflicting 'emotional distress' on them.
System 🤖
TikTok For You recommendation algorithm
Developer: ByteDance/TikTok
Operator: ByteDance/TikTok
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Safety
Research, advocacy 🧮
Eko (2023). Suicide, Incels, and Drugs: How TikTok's Deadly Algorithm Harms Kids (pdf)
ISD (2021). Hatescape: An In-Depth Analysis of Extremism and Hate Speech on TikTok (pdf)
Page info
Type: Incident
Published: April 2023
Last updated: February 2024