Character AI encourages kids to engage in disordered eating

Occurred: November 2024


Character AI has been found hosting chatbots that promote disordered eating behaviours among children and teenagers, prompting concerns about the company's integrity and approach to user safety.

What happened

Character AI allows users to access various chatbots modeled on different personas. Some of these chatbots were discovered to be "coaching" users in anorexia-like behaviours.

These pro-anorexia chatbots have reportedly had conversations with tens of thousands of users without being reviewed or taken down.

Why it happened

The incident is seen to have occurred due to several factors: the exploitation of the platform's technology to promote harmful behaviour, unethical company leadership, inadequate content moderation, and the absence of age restrictions and parental control mechanisms.

What it means

Character AI's abdication of responsibility has serious implications for individuals and society, exposing young people to eating disorders and related mental health conditions.

The incident serves as a stark reminder of the potential dangers of unregulated AI technologies, especially when they target vulnerable populations like children and teenagers.

System

Operator:
Developer: Character AI
Country: USA
Sector: Health
Purpose: Create characters
Technology: Chatbot; Generative AI; Machine learning
Issue: Safety