50 Melbourne schoolgirls targeted with AI-generated nude images

Occurred: June 2024

Fifty female students at Bacchus Marsh Grammar school in Melbourne, Australia, were targeted with AI-generated fake nude images.

The images, which were shared on social media, appeared to have been created using AI to graft photos of the girls' faces, taken from their private social media accounts, onto other people's bodies.

The mother of one of the targeted students said her 16-year-old daughter vomited after seeing the "incredibly graphic" and "mutilated" images online. The school said it was working with police to remove the images from social media and to determine whether the perpetrator was a student or someone else.

A teenage boy was subsequently arrested in connection with the incident and released pending further inquiries.

The school said it was offering support to the affected students, who were in years 9 to 12 (approximately ages 14 to 18), and to their families. The school's principal described the incident as "appalling" and said the girls should be able to learn without facing such "nonsense".

The incident drew widespread criticism, with many expressing concern about the safety and wellbeing of young girls in schools and online communities, and about the lack of adequate regulation of non-consensual deepfake pornography in Australia.

System 🤖

Operator:
Developer:  
Country: Australia
Sector: Education
Purpose:  
Technology: Deepfake - image
Issue: Safety
Transparency: Governance; Marketing