Students create deepfake nudes of St Thomas Aquinas Catholic Secondary School classmates
Occurred: March 2024
Teen boys at St. Thomas Aquinas Catholic Secondary School in London, Ontario, used AI to create and share fake nude images of female classmates, triggering investigations but no criminal charges.
Students copied social media photos of peers and altered them using AI tools to generate explicit imagery, which circulated through group chats.
The incident caused humiliation and distress among targeted students, with one victim describing lasting embarrassment for herself and her family.
School administrators warned of disciplinary measures for creators and distributors, while police investigated without pursuing charges.
Experts attribute the behaviour to a lack of awareness about AI's harmful potential and insufficient education on digital ethics.
Students called for stronger guidance on responsible AI use, as current school discussions focus more on academic cheating than image-based abuse.
Sociologist Kaitlyn Mendes highlighted gaps in addressing privacy violations and gendered harassment.
The case also exposes gaps in Canadian law around prosecuting non-consensual deepfakes, with existing statutes ill-equipped to address AI-generated imagery.
Advocates urge restorative approaches to help perpetrators understand the harm caused, alongside enhanced parental monitoring tools and AI literacy programmes.
Deepfake
Deepfakes (a portmanteau of 'deep learning' and 'fake') are images, videos, or audio which are edited or generated using artificial intelligence tools, and which may depict real or non-existent people. They are a type of synthetic media.
Source: Wikipedia 🔗
System: Unknown
Operator:
Developer:
Country: Canada
Sector: Education
Purpose: Harass/humiliate
Technology: Deepfake; Machine learning
Issue: Accountability; Impersonation; Privacy; Safety
Page info
Type: Incident
Published: April 2025