TikTok risks pushing kids towards harmful mental health content

Occurred: November 2023

TikTok's business model is 'inherently abusive' and 'poses a danger' to children, according to researchers.

An investigation by Amnesty International, the Algorithmic Transparency Institute, and AI Forensics concluded that children and young people watching mental health-related content on TikTok's personalised 'For You' page were drawn into 'rabbit holes' of potentially harmful content, including videos that romanticise and encourage depressive thinking, self-harm and suicide.

Using automated accounts set up to represent users in the USA and Kenya, the researchers found that after five to six hours on the platform, almost one in two videos served were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no expressed interest in mental health.
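
The 'rabbit hole' dynamic the researchers measured is, in essence, an engagement feedback loop: the more a simulated account lingers on a topic, the more of that topic the recommender serves. The toy simulation below is a minimal illustrative sketch of that mechanism only. Every rate and rule in it (BASE_RATE, AMPLIFICATION, VIDEOS_PER_HOUR, the linear feedback step) is an invented assumption; it does not describe TikTok's actual recommender or the researchers' exact audit protocol.

```python
# Illustrative sketch only: a toy feedback-loop simulation of the kind of
# 'sock puppet' audit described above. All constants and the feedback rule
# are assumptions, not TikTok's real system or the researchers' protocol.
import random

BASE_RATE = 0.05        # assumed share of topic content shown to a neutral account
AMPLIFICATION = 0.6     # assumed strength of the engagement feedback loop
VIDEOS_PER_HOUR = 200   # assumed feed throughput

def run_session(hours: int, engages_with_topic: bool, seed: int = 0) -> list[float]:
    """Return the hourly share of topic-related videos served to one account."""
    rng = random.Random(seed)
    interest_signal = 0.0   # the recommender's inferred interest in the topic
    hourly_shares = []
    for _ in range(hours):
        served_topic = 0
        for _ in range(VIDEOS_PER_HOUR):
            # Probability of a topic video rises with inferred interest.
            p = BASE_RATE + (1 - BASE_RATE) * AMPLIFICATION * interest_signal
            if rng.random() < p:
                served_topic += 1
                if engages_with_topic:
                    # Lingering on a topic video nudges inferred interest up,
                    # which raises the chance of more such recommendations:
                    # the 'rabbit hole' feedback loop.
                    interest_signal = min(1.0, interest_signal + 0.005)
        hourly_shares.append(served_topic / VIDEOS_PER_HOUR)
    return hourly_shares

if __name__ == "__main__":
    engaged = run_session(hours=6, engages_with_topic=True)
    control = run_session(hours=6, engages_with_topic=False)
    for hour, (e, c) in enumerate(zip(engaged, control), start=1):
        print(f"hour {hour}: engaged account {e:.0%} vs control {c:.0%}")
```

Even with these made-up numbers, the engaged account's hourly share compounds from a few percent towards roughly half the feed by hour six, while the control account stays near the base rate, which is the general shape of the disparity the investigation reported.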

Furthermore, when researchers manually rewatched the mental health-related videos recommended to these accounts, more than half of the videos subsequently served related to mental health struggles, including videos encouraging suicide.

Databank

Operator:  
Developer: ByteDance/TikTok
Country: Kenya; Philippines; USA
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Safety
Transparency: Governance