X/Twitter fails to remove graphic AI images of Taylor Swift

Occurred: January 2024


Sexually explicit AI-generated images of Taylor Swift that went viral on X/Twitter remained on the platform for up to 17 hours before they were removed.

The images, which depicted Swift in a series of sexual acts while dressed in Kansas City Chiefs memorabilia, were uploaded to the deepfake porn website Celeb Jihad and quickly went viral on X/Twitter, Facebook, Instagram, Reddit, and other platforms. According to 404 Media, the images also appeared to have been shared on a Telegram group dedicated to abusive images of women, and to have been created using Microsoft Designer.

X/Twitter eventually removed the offending images, shut down the account that first shared them, and suspended accounts that had re-shared them. However, other images quickly emerged in their place. The platform later blocked searches for Swift's name.

The incident led Swift to say she was considering legal action against Celeb Jihad. It also raised questions about X's business model and the effectiveness of its content moderation system, which became largely automated after Elon Musk fired much of the platform's safety team in 2023.

The incident was also seen as demonstrating the ease with which synthetic images can be created and distributed, and it renewed calls for effective legislation in the US.

Databank

Operator:  
Developer: X/Twitter
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Moderate content
Technology: Content moderation system; Machine learning
Issue: Business model; Privacy; Robustness; Safety
Transparency: