OpenDream AI art generator accused of generating child sex images
Occurred: October 2024
AI art generation platform OpenDream has been accused of enabling the creation and public display of child sexual abuse material (CSAM) and non-consensual explicit deepfakes, raising serious questions about the governance, leadership and ethics of the people behind it.
Marketed as an AI art generator, OpenDream was found by the investigative outlet Bellingcat to host publicly accessible galleries containing AI-generated CSAM and sexually explicit deepfakes of celebrities.
Users exploited the platform’s tools to create synthetic images depicting children in sexualised scenarios, with prompts explicitly referencing minors and sexual acts.
The images remained visible for months without moderation, were indexed by search engines, and could be accessed without logging in.
OpenDream failed to implement adequate moderation systems to prevent the misuse of its AI tools for generating illegal and harmful content.
For victims depicted in these images, whether real or synthetic, the harms are profound, including psychological distress and trauma.
The incident underscores the urgent need for stricter regulations and industry-wide safeguards against misuse of generative AI technologies.
Generative art
Generative art is post-conceptual art that has been created (in whole or in part) with the use of an autonomous system.
Source: Wikipedia
Operator:
Developer: CBM Media
Country: Vietnam
Sector: Media/entertainment/sports/arts
Purpose: Generate artwork
Technology: Generative AI; Machine learning
Issue: Accountability; Privacy; Safety
Page info
Type: Incident
Published: April 2025