Teen distributes AI-generated nude pictures of Issaquah students

Occurred: October 2023


A teenage boy used AI to generate nude images of his female classmates and a staff member at Issaquah High School in Issaquah, Washington, and circulated them around the school.

The images were created with an unnamed web-based nudification app, which automatically edits photographs of women to make them appear naked. A student reportedly discovered the app on TikTok and then posted some of the nudified photographs on Snapchat or showed them to other students over lunch at the school.

The school referred the incident to the local police, who launched an investigation. Media reports later indicated that no charges had been filed against the perpetrator.

The incident was seen to highlight how easily harmful deepfake images can be created and circulated, and the lack of state or federal US laws directly addressing the creation and distribution of deepfake images intended to harass or otherwise harm people.

Databank

Operator: Issaquah High School students
Developer: 
Country: USA
Sector: Education
Purpose: Harass/intimidate/shame
Technology: Deepfake - image; Generative adversarial network (GAN); Neural network; Deep learning; Machine learning
Issue: Accountability; Safety; Transparency

Page info
Type: Incident
Published: January 2024
Last updated: February 2024