TikTok For You pushes suicide, violence, misogyny
Occurred: March 2023
Research reports show that TikTok has been automatically serving violent, extremist, abusive, self-harm, and misogynistic videos to users, raising serious doubts about the company's repeated public commitments to provide a safe environment, and about the effectiveness of its safety technologies and governance.
According to a report (pdf) by US corporate accountability group Eko, TikTok's For You recommendation algorithm automatically showed violence and self-harm videos to young users within ten minutes of their joining the platform. Eko researchers also found that suicide-related hashtags on the platform had garnered 8.8 billion views.
The report was published during a congressional hearing at which TikTok CEO Shou Zi Chew was accused of allowing harmful content to be served to young users and of inflicting 'emotional distress' on them.
A 2021 report (pdf) by the London-based think tank the Institute for Strategic Dialogue (ISD) found that anti-Asian and pro-Nazi videos were garnering millions of views on TikTok, often set to pop songs in order to evade the platform's content moderation systems.
Purpose: Recommend content
Technology: Recommendation algorithm
Issue: Safety; Ethics
Transparency: Governance; Black box
News, commentary, analysis
Published: April 2023
Last updated: June 2023