TikTok For You pushes suicide, violence, misogyny

Occurred: March 2023


TikTok automatically showed violent, extremist, abusive, self-harm, and misogynistic videos to users, raising doubts about the company's many stated commitments to providing a safe environment for its users, and about its safety technologies and governance.

According to a report (pdf) by US corporate accountability group Eko, TikTok's For You recommendation algorithm automatically showed violent and self-harm videos to young users within ten minutes of them joining the platform. Eko researchers also found that suicide-related hashtags on the platform had garnered 8.8 billion views.

The report was published during a congressional hearing in which TikTok CEO Shou Zi Chew was accused of allowing harmful content to be served to young users and of inflicting 'emotional distress' on them.

A 2021 report (pdf) by the London-based think tank the Institute for Strategic Dialogue (ISD) found that anti-Asian and pro-Nazi videos were garnering millions of views on TikTok, often using pop songs to evade the platform's content moderation systems.

Developer: ByteDance/TikTok
Operator: ByteDance/TikTok
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Safety; Ethics
Transparency: Governance; Black box

Page info
Type: Incident
Published: April 2023
Last updated: February 2024