TikTok accused of promoting suicide amongst French youngsters
Occurred: November 2025
Page published: November 2025
TikTok is under criminal investigation in France after multiple families and a major rights report accused its recommendation algorithm of steering vulnerable French teenagers toward content glorifying self-harm and suicide.
Several French families and parliamentary committees initiated legal and criminal action against TikTok, alleging that its algorithm exposes young users, with some as young as 13, to harmful content promoting suicide, self-harm, and eating disorders.
Two of the adolescents in these families (both aged 15) died by suicide.
In November 2025, French prosecutors opened a criminal investigation into TikTok, looking at whether the platform promoted “propaganda … of methods … used to commit suicide,” and whether it failed in its duty to alert authorities about dangerous content.
Separately, research by Amnesty International found that test accounts began seeing “sadness” videos within minutes of logging on; over time, the amount of depressive content more than doubled, and some videos included explicit references to suicide methods.
TikTok strongly denies the allegations, stating it has “over 50 pre-set features … to support the safety and well-being of teens” and claims it removes 90 percent of violative videos before they are viewed.
Critics argue that TikTok’s algorithm may disproportionately recommend emotionally intense or “dark” content to vulnerable users, thereby creating a feedback loop of despair.
According to Amnesty, TikTok’s design amplifies risk rather than mitigates it: once a user shows interest in “sad” or depressing material, the algorithm intensifies similar content.
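The amplification dynamic described above can be illustrated with a toy simulation. This is a hypothetical model for illustration only, not TikTok’s actual system: all names, weights, and probabilities are assumptions. It shows how a recommender that weights topics by past engagement can turn a slight initial interest in “sad” content into a feed dominated by it.

```python
# Toy model of an engagement-driven feedback loop (hypothetical; not
# TikTok's actual algorithm). Topics are recommended in proportion to
# accumulated engagement, so a small initial interest compounds.
import random

TOPICS = ["neutral", "sad"]

def recommend(weights):
    """Pick a topic with probability proportional to its engagement weight."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]

def simulate(steps=200, sad_engage_prob=0.8, neutral_engage_prob=0.3):
    """Return the share of 'sad' videos shown over a simulated session.

    A user assumed to engage more readily with sad content keeps
    reinforcing its weight, so the recommender shows more of it.
    """
    # Start nearly balanced, with a slight initial interest in sad content.
    weights = {"neutral": 1.0, "sad": 1.2}
    sad_shown = 0
    for _ in range(steps):
        topic = recommend(weights)
        if topic == "sad":
            sad_shown += 1
        # Engagement reinforces the weight of whatever was just shown.
        engage_prob = sad_engage_prob if topic == "sad" else neutral_engage_prob
        if random.random() < engage_prob:
            weights[topic] += 1.0
    return sad_shown / steps

if __name__ == "__main__":
    random.seed(1)
    print(f"Share of 'sad' videos in feed: {simulate():.0%}")
```

Because each view of a topic raises the chance it is shown again, the loop is self-reinforcing — the “rich get richer” dynamic critics attribute to engagement-optimised recommenders.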
Families and legal representatives also accuse TikTok of insufficient content moderation: despite reports from users, self-harm or suicidal content has allegedly remained accessible.
There may also be accountability gaps: under-reporting of harmful content, lack of robust systems to detect suicidal ideation, and limited transparency on how the recommendation system works.
For those directly impacted: The allegations suggest that vulnerable teens may be drawn into dangerous content loops that worsen mental health, potentially contributing to self-harm or suicide. Families of victims are seeking legal redress, validation, and potentially compensation.
For broader society: This case spotlights a growing concern about how social media platforms influence youth mental health, especially when recommendation systems can “trap” users in harmful content.
For regulation and accountability: The investigation could lead to stricter regulation in France (and possibly beyond) of how social media platforms moderate and recommend sensitive content, especially under the EU’s Digital Services Act. It may also set a precedent for holding tech companies legally responsible for the mental health impacts of their algorithms.
For You
Developer: ByteDance/TikTok
Country: France
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Accountability; Safety; Transparency
EU Digital Services Act
Amnesty International. Dragged into the rabbit hole: New evidence of TikTok’s risks to children’s mental health
AIAAIC Repository ID: AIAAIC2130