Occurred: June 2021
Page published: November 2021 | Last updated: June 2025
Footage of a girl dancing spliced with a highly graphic and disturbing video of a man being beheaded in a bathroom went viral on TikTok, raising concerns about the platform's safety and underlining the need for effective regulation of algorithmic systems serving minors.
A graphic video showing a person being beheaded went viral on TikTok after being uploaded by the user @mayengg03. The video was designed to "trick" the platform's automated moderation systems: it began innocuously with a young girl in a black tank top dancing to a song by the rapper Doja Cat. Without warning, the footage cut to a scene of Spanish-speaking men decapitating an individual in a bathroom.
The video circulated widely, appearing on the "For You" feeds of unsuspecting users across the globe. While the exact number of views is unconfirmed, the incident triggered a massive wave of community-led warnings as users attempted to shield others from the trauma. TikTok eventually removed the original video and added it to its "Hashbank" (a digital fingerprinting database) to block re-uploads, but not before the content had already inflicted significant psychological distress on a primarily young audience.
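The "Hashbank" described above can be sketched as hash-based fingerprint matching. The following is an illustrative toy, not TikTok's implementation: the `HashBank` class and the use of exact SHA-256 matching are assumptions, and production systems reportedly rely on perceptual hashes that also match re-encoded or lightly edited copies.

```python
import hashlib

class HashBank:
    """Toy fingerprint database for blocking re-uploads of banned videos."""

    def __init__(self):
        self._banned = set()

    def fingerprint(self, video_bytes: bytes) -> str:
        # SHA-256 over the raw bytes: byte-identical re-uploads hash identically.
        return hashlib.sha256(video_bytes).hexdigest()

    def ban(self, video_bytes: bytes) -> None:
        # Store only the fingerprint, not the video itself.
        self._banned.add(self.fingerprint(video_bytes))

    def is_banned(self, video_bytes: bytes) -> bool:
        return self.fingerprint(video_bytes) in self._banned


bank = HashBank()
original = b"\x00graphic-video-bytes"  # stand-in for the removed footage
bank.ban(original)

print(bank.is_banned(original))            # True: an exact copy is blocked
print(bank.is_banned(original + b"\x01"))  # False: one changed byte evades exact matching
```

The second check illustrates why exact hashing alone is insufficient: trivial re-encoding changes the bytes, which is why real hashbanks use perceptual (similarity-tolerant) fingerprints.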
The video had circulated on gore sites for two years before the TikTok upload; reports from its initial discovery in 2019 alleged the victim was a 19-year-old Mexican man.
The root cause of the incident appears to be limitations in TikTok's automated content moderation systems: AI filters are easily "tricked" when graphic content is embedded within a longer innocuous video, reducing the likelihood of detection before human review.
While TikTok employs thousands of human moderators, the sheer volume of uploads, combined with an AI system that prioritises engagement metrics, allowed the video to reach a "viral" threshold before human intervention occurred.
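The evasion described above can be illustrated with a toy model. All parameters here are hypothetical (the `sampled_flags` function, frame counts, and stride values are not drawn from TikTok's actual pipeline): the point is simply that a moderation pass which classifies only a sparse sample of frames can miss a short harmful segment spliced into a longer benign clip.

```python
def sampled_flags(frames, stride):
    """Return the labels of every `stride`-th frame, as a sparse moderation pass might see them."""
    return [frames[i] for i in range(0, len(frames), stride)]

# 0 = benign frame, 1 = graphic frame.
# A 64-frame clip with a short graphic segment spliced in at indices 33-36.
video = [0] * 33 + [1] * 4 + [0] * 27

# Sparse pass checks frames 0, 8, 16, 24, 32, 40, 48, 56 - all benign.
print(any(sampled_flags(video, stride=8)))  # False: the splice falls between samples
# A frame-by-frame review catches it.
print(any(sampled_flags(video, stride=1)))  # True
```

This is why splicing is an effective attack on sampled-frame classifiers: the attacker only needs the harmful segment to be short relative to the sampling interval, which in turn motivates denser sampling or scene-change-triggered analysis on the platform side.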
TikTok said it had "quickly" removed the offending item. However, long after the event, TikTok users continued to report that they were too scared to use the app for fear of encountering the beheading footage.
For users, particularly minors, witnessing videos of this type can be deeply distressing, sometimes resulting in lasting psychological trauma. Furthermore, the "bait-and-switch" nature of the video created a sense of "digital unsafety," where even seemingly benign content can harbour harmful imagery.
For society, the incident highlighted the limitations of "automated safety" and sparked a heated conversation about the accountability of social media giants and the need for stricter "Online Safety" legislation. It demonstrated that as long as platforms prioritise algorithmic speed and engagement over rigorous human-in-the-loop moderation, "realised harms" such as the normalisation of extreme violence will continue to permeate public digital spaces.
TikTok For You
TikTok content moderation system
Developer: TikTok
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Moderate content; Recommend content
Technology: Content moderation system; Recommendation system; Machine learning
Issue: Accountability; Accuracy/reliability; Normalisation; Safety; Transparency
June 9–10, 2021: A spliced beheading video uploaded by user @mayengg03 bypasses TikTok's AI filters. It goes viral globally, appearing on the "For You" feeds of millions of users, many of whom are minors
June 11, 2021: TikTok bans the original account and issues a formal apology. The video is added to the "Hashbank" to automate the removal of re-uploads
December 2021: Former content moderator Candie Frazier files a class-action lawsuit, alleging she developed PTSD and anxiety from reviewing thousands of copies of the graphic footage due to TikTok's inadequate mental health safeguards
October 26, 2023: The UK Online Safety Act becomes law, with legislators specifically citing events like the 2021 beheading as reasons to impose a "Duty of Care" on social media platforms.
February 17, 2024: The EU Digital Services Act (DSA) comes into full effect, requiring TikTok to mitigate systemic risks, including the spread of illegal or traumatic content, or face fines of up to 6% of global turnover.
May–July 2025: Under the UK Online Safety Act, platforms must complete their first mandatory "Children's Risk Assessments" to prove they are preventing minors from seeing "illegal and harmful" content like the 2021 beheading video.
September–December 2025: TikTok continues to face a "Multidistrict Litigation" (MDL-3047) involving over 2,100 claims from families and states regarding the addictive and harmful nature of its algorithms. In December 2025, the EU issues a €120 million fine to "X" (formerly Twitter) under the DSA, signalling a new era of strict enforcement that TikTok must now navigate.
Frazier v. ByteDance
https://www.insider.com/tiktok-beheading-video-removing-from-platform-2021-6
https://www.newsweek.com/tiktok-graphic-beheading-video-company-response-1598107
https://news.yahoo.com/gory-beheading-video-inserted-notorious-145356485.html
https://www.dailydot.com/unclick/tiktok-beheading-little-girl-viral-video/
https://www.thedailybeast.com/tiktok-scrambles-to-stop-suicide-video-from-spreading-on-app
AIAAIC Repository ID: AIAAIC0653