TikTok accused of promoting suicide amongst French youngsters
Occurred: November 2025
Page published: November 2025
TikTok is under criminal investigation in France after multiple families and a major rights report accused its recommendation algorithm of steering vulnerable French teenagers toward content glorifying self-harm and suicide.
Several French families and parliamentary committees initiated legal and criminal action against TikTok, alleging that its algorithm exposes young users, some as young as 13, to harmful content promoting suicide, self-harm, and eating disorders.
Two of the adolescents in these families (both aged 15) died by suicide.
In November 2025, French prosecutors opened a criminal investigation into TikTok, examining whether the platform promoted "propaganda … of methods … used to commit suicide," and whether it failed in its duty to alert authorities to dangerous content.
Separately, research by Amnesty International showed that test accounts started seeing "sadness" videos just minutes after logging on; over time, the amount of depressive content more than doubled, and some content even included explicit references to suicide methods.
TikTok strongly denies the allegations, stating that it has "over 50 pre-set features … to support the safety and well-being of teens" and that it removes 90 percent of violative videos before they are viewed.
Critics argue that TikTok's algorithm may disproportionately recommend emotionally intense or "dark" content to vulnerable users, creating a feedback loop of despair.
According to Amnesty, TikTok's design amplifies risk rather than mitigating it: once a user shows interest in "sad" or depressing material, the algorithm intensifies similar content.
Families and legal representatives also accuse TikTok of insufficient content moderation: despite reports from users, self-harm and suicidal content has allegedly remained accessible.
There may also be accountability gaps: under-reporting of harmful content, lack of robust systems to detect suicidal ideation, and limited transparency on how the recommendation system works.
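The feedback loop critics describe can be illustrated with a toy simulation. This is a hypothetical sketch, not TikTok's actual system: a recommender that treats engagement with one content category as an interest signal and reweights future recommendations accordingly, so a small initial tilt compounds over time.

```python
# Hypothetical toy model (NOT TikTok's actual recommender) showing how
# engagement-driven reweighting can amplify one content category.
import random

random.seed(0)


def recommend(weights):
    """Pick a content category in proportion to its current weight."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    cumulative = 0.0
    for category, weight in weights.items():
        cumulative += weight
        if r <= cumulative:
            return category
    return list(weights)[-1]


def simulate(steps=200, boost=1.05):
    # Both categories start with equal weight; each time "sad" content is
    # shown, the toy recommender reads that as interest and boosts it.
    weights = {"neutral": 1.0, "sad": 1.0}
    for _ in range(steps):
        if recommend(weights) == "sad":   # engagement signal
            weights["sad"] *= boost       # positive feedback
    return weights["sad"] / sum(weights.values())


print(f"final share of 'sad' content weight: {simulate():.2f}")
```

Even with a modest 5 percent boost per interaction, the reinforced category comes to dominate the feed within a few hundred steps, which is the "rabbit hole" dynamic the Amnesty report describes.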
For those directly impacted: The allegations suggest that vulnerable teens may be drawn into dangerous content loops that worsen mental health, potentially contributing to self-harm or suicide. Families of victims are seeking legal redress, validation, and potentially compensation.
For broader society: This case spotlights a growing concern about how social media platforms influence youth mental health, especially when recommendation systems can "trap" users in harmful content.
For regulation and accountability: The investigation could lead to stricter regulation in France (and possibly beyond) of how social media platforms moderate and recommend sensitive content, especially under the EU's Digital Services Act. It may also set a precedent for holding tech companies legally responsible for the mental health impacts of their algorithms.
Developer: ByteDance/TikTok
Country: France
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Accountability; Safety; Transparency
Amnesty International. Dragged into the rabbit hole: New evidence of TikTok's risks to children's mental health
AIAAIC Repository ID: AIAAIC2130