TikTok For You pushes suicide, violence, misogyny

Occurred: March 2023

TikTok automatically showed violent, extremist, abusive, self-harm, and misogynistic videos to users, raising doubts about the company's stated commitment to providing a safe environment for its users.

According to a report (pdf) by US corporate accountability group Eko, TikTok's For You recommendation algorithm automatically showed violent and self-harm videos to young users within ten minutes of them starting to use the platform.

Eko researchers also found that hashtags on the platform associated with suicide content had garnered 8.8 billion views.

The report was published during a congressional hearing in which TikTok CEO Shou Zi Chew was accused of allowing harmful content to be served to young users and of inflicting 'emotional distress' on them.

System 🤖

Developer: ByteDance/TikTok
Operator: ByteDance/TikTok
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Safety

Page info
Type: Incident
Published: April 2023
Last updated: February 2024