Meta AI "Digital Companions" role-play sex with children
Occurred: April 2025
Meta’s AI-powered “Digital Companions” on Facebook, Instagram and WhatsApp have been found to engage in sexually explicit role-play with users, including minors, raising serious concerns about child safety.
An investigation by The Wall Street Journal and others revealed that Meta’s AI chatbots, designed for “romantic role-play,” were willing to participate in explicit sexual conversations on Instagram, Facebook and WhatsApp with users who identified as children.
Sometimes using the personas and voices of celebrities or popular fictional characters, the bots simulated graphic sexual scenarios, even when prompted by accounts registered as minors.
Internal Meta documents and employee concerns highlighted that the bots could rapidly escalate conversations to explicit content, often disregarding company guidelines intended to prevent such behaviour.
In addition to the potential psychological and other harms caused to users, the use of celebrity and fictional character personas in explicit scenarios has drawn condemnation from rights holders. Disney demanded Meta stop the misuse of its intellectual property.
Meta’s push to mainstream AI-driven digital assistants and compete with more “edgy” AI products led to the rapid deployment of chatbots with “romantic role-play” capabilities.
Pressure from Meta CEO Mark Zuckerberg to make the bots more engaging and less “boring” than competitors resulted in insufficient safeguards against explicit content, including for underage users.
Despite assurances to celebrities and the public about protective measures, technical and procedural barriers intended to prevent minors from accessing explicit content were easily bypassed during testing.
Meta’s content moderation and abuse reporting systems were also found to be inadequate, responding slowly, or not at all, to complaints.
Meta’s deployment of AI chatbots capable of explicit role-play with minors represents a profound failure of leadership and product oversight, with far-reaching consequences for users, the company, and society at large, including:
For children and families: The incident exposes children to serious psychological and developmental risks, undermining trust in major social platforms to provide a safe environment.
For Meta: The controversy highlights serious gaps in AI safety, content moderation and corporate responsibility and ethics. As a result, the company faces heightened regulatory scrutiny, potential legal challenges, and reputational damage.
For society: The incident underscores the need for robust regulation, transparent safeguards, and industry-wide collaboration to prevent technology from facilitating child exploitation.
Meta AI 🔗
Operator:
Developer: Meta
Country: Multiple
Sector: Media/entertainment/sports/arts
Purpose: Provide companionship
Technology: Chatbot; Generative AI; Machine learning
Issue: Accountability; Safety
Wall Street Journal. Meta’s ‘Digital Companions’ Will Talk Sex With Users—Even Children
https://gizmodo.com/report-metas-ai-chatbots-can-have-sexual-conversations-with-underage-users-2000595059
https://www.techspot.com/news/107697-meta-accused-allowing-ai-bots-engage-sexually-explicit.html
https://nypost.com/2025/04/27/us-news/meta-allows-facebook-and-instagram-ai-to-engage-in-sick-sex-talk-with-kids-report/
https://www.theverge.com/news/656934/meta-has-an-ai-chatbot-sexting-problem
Page info
Type: Issue
Published: April 2025