TikTok pushes suicide content to kids

Occurred: March 2023


TikTok's For You recommendation algorithm automatically shows violence and self-harm videos to young users within ten minutes of them starting to use the platform, according to a report (pdf) by US corporate accountability group Eko. 

TikTok has regularly said it would crack down on extremist, violent and unsafe content. However, Eko researchers also discovered that hashtags associated with suicide content had been included in over a million TikTok posts and had garnered 8.8 billion views. 

The findings led digital rights and mental health campaigners to complain that the platform is unsafe and harmful to young people. The report was published during a congressional hearing in which TikTok CEO Shou Zi Chew was accused of allowing harmful content to be served to young users, and of inflicting 'emotional distress' on them. 

Developer: ByteDance/TikTok
Operator: ByteDance/TikTok
Country: USA
Sector: Technology
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Safety; Ethics
Transparency: Governance; Black box