Botify AI hosts sexual conversations with underage celebrity bots
Occurred: February 2025
AI companion platform Botify AI hosts sexually charged conversations with bots resembling underage celebrities, raising concerns about its governance, ethics and safety, according to an investigation.
MIT Technology Review revealed that Botify AI features bots mimicking underage characters and celebrities, including Jenna Ortega's Wednesday Addams, Emma Watson's Hermione Granger, and Millie Bobby Brown, without the knowledge or permission of the actors or rights holders.
The bots engaged in sexually suggestive conversations, shared AI-generated provocative images, and dismissed age-of-consent laws as "arbitrary" and "meant to be broken."
Despite Botify AI's stated rules against underage bots, its moderation systems failed to prevent their creation and promotion, with some bots receiving millions of likes before their removal.
The incident highlights significant gaps in Botify AI's content moderation systems. The company admitted its filters failed to block inappropriate content, citing limited resources and the complexity of real-time moderation.
In an apparent attempt to deflect responsibility, Botify AI's parent company, Ex-Human, argued that such issues reflect "an industry-wide challenge."
The lack of comprehensive industry standards further exacerbates the issue, leaving platforms like Botify AI operating in a largely unregulated environment.
For those directly impacted, such as users engaging with these bots and celebrities whose likenesses are exploited, the findings raise concerns about emotional manipulation and reputational harm.
More generally, the case serves as a stark reminder of the risks posed by unchecked technological innovation and the need for accountability among investors, developers and regulators.
Botify AI is backed by controversial venture capital firm Andreessen Horowitz.
Operator:
Developer: Ex-Human Inc
Country: USA
Sector: Health
Purpose: Provide emotional support
Technology: Chatbot; Generative AI; Machine learning
Issue: Accountability; Safety; Transparency
Page info
Type: Incident
Published: April 2025