Miami boys arrested for creating and sharing nude images of students

Occurred: March 2024

Two students at Pinecrest Cove Academy in Miami used AI to create disturbing nude images of their classmates, prompting outrage and leading to their suspension and arrest.

According to reports, the 13- and 14-year-old perpetrators took headshots of male and female students from the school's social media account and ran them through an AI deepfake app to create the nude images. The AI-generated images were then shared among students on social media.

The images were said to have humiliated the victims, left them feeling violated, and harmed their mental health. Several students did not want to return to class in the days following the discovery.

The culprits were initially suspended for 10 days and later charged with third-degree felonies under a 2022 Florida law that criminalises the dissemination of sexually explicit deepfake images without the victim's consent. The incident was believed to be the first instance of criminal charges related to AI-generated nude images in the US.

Databank

Operator: Pinecrest Cove Academy students  
Developer:  
Country: USA
Sector: Education
Purpose: Nudify women, men
Technology: Deepfake - image; Neural network; Deep learning; Machine learning
Issue: Ethics/values; Privacy; Safety
Transparency: Governance; Marketing