Stalker uses Sora 2 to harass technology journalist
Occurred: October 2025
Page published: October 2025
A prominent technology journalist was harassed by a stalker who used OpenAI's Sora 2 video generator to create and distribute non-consensual nude deepfake videos of her likeness.
The stalker targeted high-profile technology journalist Taylor Lorenz, using Sora 2 to generate and share fake nude videos featuring her image without her consent.
The perpetrator also operated numerous online accounts dedicated to Lorenz, amplifying the harassment and increasing the risks to her safety and privacy.
Sora 2's weak content moderation safeguards allowed the stalker to bypass the app's restrictions and produce abusive, harassing videos. The same weaknesses enable users to create offensive and damaging deepfake videos of real individuals with few constraints.
The paucity of safeguards is compounded by the poor visibility of OpenAI's complaint and appeal mechanisms, leaving the perpetrator and other bad actors free to operate largely unchallenged.
The incident demonstrates the dangers posed by Sora and other advanced generative AI tools when proper safeguards are lacking.
It also highlights the actual and potential harm caused by organisations whose hunger for revenue, market share and publicity overrides the safety of their users and the general public.
Developer: OpenAI
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Harass individual
Technology: Deepfake; Generative AI; Machine learning
Issue: Accountability; Safety; Transparency
AIAAIC Repository ID: AIAAIC2047