Adult chatbot exposes 2 million AI porn images, including women's yearbook pictures
Occurred: 2025
Page published: November 2025
Erotic roleplay chatbot and AI image generator Secret Desires left an unsecured cloud database containing nearly two million real and AI‑generated explicit images and videos publicly accessible, exposing the people depicted to privacy violations, harassment, and potential long‑term harm.
Secret Desires stored user‑uploaded photos, AI‑generated pornographic outputs, and metadata such as women’s names and in some cases their workplaces or schools, in cloud storage that was left open to the public internet.
Discovered by 404 Media, the exposed dataset included close to two million non-consensual sexualised images and videos, ranging from screenshots of social media profiles and everyday personal photos to explicit deepfake‑style content generated via a now‑retired face‑swap feature.
At least one image was traced back to a woman’s yearbook photo, whilst some filenames suggest the sexualisation of minors, raising the risk that the dataset includes illegal child sexual abuse material alongside adult deepfakes.
The material seems to have been publicly reachable for months before being taken offline shortly after journalists notified the company, meaning it could have been copied, redistributed, or weaponised without the knowledge of victims or regulators.
The exposure violated the victims' privacy and leaves them open to long-term reputational, psychological and safety risks, including targeted humiliation, harassment, doxxing, stalking, and extortion.
Secret Desires' creator, Playhouse Media, collected training data with almost no transparency, offered no disclosure of data sources, and operated with minimal content-safety oversight. Weak internal security allowed the photo corpus to be exposed.
The company also appears to have bypassed basic due-diligence standards around biometric data, minors’ images, and explicit-content generation.
Creating such material was made easy by the accessibility of AI tools such as "nudify" apps, which make generating non-consensual sexual images almost instantaneous and require little to no technical skill.
More broadly, regulators have not yet imposed consistent rules around non‑consensual deepfake pornography, allowing services that advertise “limitless” or “uncensored” image generation to flourish while offloading nearly all risk onto victims.
Law enforcement and civil lawsuits are only beginning to target operators of deepfake‑porn sites, creating a lag in deterrence that incentivises rapid growth and data harvesting over privacy, safety, and transparency.
For victims: It means a profound and lasting loss of bodily autonomy, digital safety, and privacy. The damage is a form of sexual abuse that is difficult to undo, as images, once on the internet, are nearly impossible to completely remove. Victims are often forced to take on the emotional and logistical burden of reporting and attempting to have the content "taken down."
For women and girls: The pervasive threat creates a chilling effect, causing many to limit their presence online, restrict what photos they share, and live with an underlying fear of their image being exploited. This is a significant setback for gender equality in digital spaces.
For society: The phenomenon signals a serious crisis in which AI technologies have outpaced law. It points to legal gaps, with many jurisdictions criminalising the distribution of intimate images but not their creation, leaving victims unable to seek justice against the original fabricator. The ease of creating "on-demand" sexual images also reinforces a cultural problem of sexual objectification and entitlement, undermining the necessity of consent and genuine human connection.
Developer: Playhouse Media LLC
Country: Global
Sector: Media/entertainment/sports/arts
Purpose: Build AI companion
Technology: Deepfake
Issue: Accountability; Privacy/surveillance; Safety; Security; Transparency
AIAAIC Repository ID: AIAAIC2141