TikTok hate speech detection system accused of racial bias

Occurred: July 2021

TikTok's hate speech detection system was accused of racial bias by Black creators and rights activists for appearing to block content associated with Black users.

Per The Verge, the furore was sparked by Black influencer Ziggi Tyler finding that phrases including 'Black Lives Matter' and 'Black success' were flagged by TikTok as 'inappropriate' when he tried to update his creator profile. 

At the same time, TikTok's system allowed phrases such as 'white supremacy' and 'white success'. 

TikTok responded by attributing the problem to its content moderation technology and claimed it had fixed the error.

The incident raised questions about TikTok's willingness and ability to detect and act upon hate speech and racial bias on its platform.

Operator: ByteDance/TikTok
Developer: ByteDance/TikTok
Country: USA; Global
Sector: Media/entertainment/sports/arts
Purpose: Detect hate speech
Technology: Recommendation algorithm
Issue: Bias/discrimination - race
Transparency: Black box; Governance

Page info
Type: Incident
Published: December 2021