Chase Nasca takes own life after being swamped with TikTok suicide videos
Occurred: February 2022
A 16-year-old from Long Island, USA, died by suicide after allegedly being inundated with suicide-themed videos on TikTok, prompting his family to sue the platform for negligence and product liability.
Chase Nasca, a high school junior and honours student, died by suicide after walking in front of a train near his home.
In a lawsuit, his parents allege that TikTok’s algorithm served him more than 1,000 unsolicited and increasingly extreme videos promoting self-harm and suicide, including content specifically themed around railway suicides, which they claim exploited his proximity to train tracks.
Court documents state that TikTok used geolocation data to serve Chase "railroad-themed suicide videos," and that despite his searches for motivational content, he was instead exposed to thousands of harmful clips.
The incident had devastating consequences for Chase's family, and highlighted broader concerns about the mental health risks posed to young users by algorithm-driven social media feeds.
According to the lawsuit, TikTok’s design and recommendation algorithm prioritised engagement by promoting content that users were likely to watch repeatedly, regardless of its potential harm.
Chase’s parents argue that the app’s algorithm exploited his vulnerabilities and emotional state, leading to a "progression" of extreme content that contributed to his mental health decline and eventual suicide.
They claim that TikTok failed to monitor and remove dangerous content, and that the platform’s use of location data further targeted Chase with videos related to railway suicide, a method he ultimately used.
TikTok, in its legal defence, has argued that it is protected from liability under Section 230 and the First Amendment, maintaining that it cannot be held responsible for third-party content or users’ actions.
Chase Nasca’s case has become emblematic of growing concerns about the mental health impact of algorithm-driven platforms on teenagers, especially regarding exposure to self-harm and suicide-related material.
The lawsuit and its publicity have intensified public and legislative scrutiny of TikTok and similar platforms, fueling debates over tech regulation, youth safety online, and the responsibilities of social media companies.
Recommender system
A recommender system (RecSys), or recommendation system (sometimes with "system" replaced by terms such as "platform", "engine", or "algorithm"), and often referred to simply as "the algorithm", is a subclass of information filtering system that provides suggestions for items most pertinent to a particular user.
Source: Wikipedia 🔗
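As a rough illustration of the definition above, the following Python sketch ranks unseen videos purely by how much a user previously engaged with each topic. The catalogue, topic labels, and scoring rule are hypothetical and greatly simplified; real systems such as TikTok's "For You" feed are far more complex, but the sketch shows how an engagement-only objective can keep steering a feed toward whatever a user lingers on.

```python
# Minimal, hypothetical sketch of an engagement-driven recommender.
# The catalogue, topics, and scoring rule are invented for illustration;
# this is not TikTok's actual "For You" algorithm.
from collections import Counter

# Toy catalogue: each video id is tagged with a single topic.
CATALOGUE = {
    "v1": "sport", "v2": "music", "v3": "gaming",
    "v4": "sport", "v5": "music", "v6": "music", "v7": "news",
}

def recommend(watch_history, k=3):
    """Rank unseen videos by how often the user fully watched their topic.

    watch_history: list of (video_id, watched_to_end) tuples.
    Topics the user dwells on are promoted further, producing the
    engagement feedback loop described in the allegations above.
    """
    topic_engagement = Counter()
    seen = set()
    for video_id, watched_to_end in watch_history:
        seen.add(video_id)
        if watched_to_end:
            topic_engagement[CATALOGUE[video_id]] += 1

    candidates = [v for v in CATALOGUE if v not in seen]
    # Score each candidate purely by prior engagement with its topic;
    # nothing in this objective considers whether the content is harmful.
    return sorted(candidates,
                  key=lambda v: topic_engagement[CATALOGUE[v]],
                  reverse=True)[:k]

if __name__ == "__main__":
    history = [("v2", True), ("v5", True), ("v1", False)]
    # The unseen music clip tops the feed because the user dwelt on music.
    print(recommend(history))  # ['v6', 'v3', 'v4']
```

Because the objective rewards predicted engagement alone, nothing in it distinguishes harmful material from benign material, which is the gap the Nasca lawsuit alleges.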
February 2022. Chase Nasca, a 16-year-old high school student from Bayport, Long Island, dies by suicide after walking in front of a Long Island Rail Road train near his home.
March 2023. Chase Nasca’s parents file a wrongful death lawsuit against ByteDance, Inc. and TikTok, Inc. in the Supreme Court of the State of New York, Suffolk County.
April 2025. The court rules in favour of TikTok, dismissing the lawsuit. The court finds that the claims are fundamentally based on TikTok’s role as a publisher of third-party content and are therefore barred by Section 230 protections.
For You 🔗
Operator:
Developer: TikTok
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm; Machine learning
Issue: Accountability; Business model; Liability; Safety; Transparency
Nasca v. ByteDance, Ltd.
Page info
Type: Incident
Published: April 2025