Deepfake nudes target female students at Winnipeg school
Occurred: December 2023
Deepfake nudes targeting young female students at Collège Béliveau in Winnipeg, Canada, caused serious distress and raised concerns about the misuse of AI.
Publicly accessible social media photos of the students were altered using AI to produce highly explicit images, which were then shared online, prompting students to report the incident to school officials.
Approximately 300 fake nude images were believed to have been created, affecting dozens of girls, many of whom reportedly suffered serious emotional distress, including feelings of violation, shame, anger, and fear of others.
Winnipeg police investigated the incident but laid no charges.
The incident was made possible by the increasing accessibility and sophistication of AI image generation tools, which allow highly realistic deepfakes to be produced with minimal effort.
Furthermore, some of these tools are expressly designed to facilitate this kind of behaviour, while others have few or no guardrails limiting how they can be used.
The perpetrators may have used one of these tools without fully understanding the gravity of their actions or the legal implications.
The incident highlights the growing challenge of protecting individuals, especially young girls, from abuse by people using AI.
It also underscores the need for fit-for-purpose legislation to address AI-generated explicit content, particularly content involving minors, and for improved digital literacy education in schools and amongst parents.
Deepfake
Deepfakes (a portmanteau of 'deep learning' and 'fake') are images, videos, or audio which are edited or generated using artificial intelligence tools, and which may depict real or non-existent people. They are a type of synthetic media.
Source: Wikipedia
System: Unknown
Operator:
Developer:
Country: Canada
Sector: Education
Purpose: Harass/humiliate
Technology: Deepfake - image
Issue: Impersonation; Safety
Page info
Type: Incident
Published: February 2025