Character AI encourages kids to engage in disordered eating
Occurred: November 2024
Character AI has been found hosting chatbots that promote disordered eating behaviours amongst children and teenagers, prompting concerns about the company's integrity and approach to user safety.
Character AI allows users to access various chatbots modelled on different personas. Some of these chatbots were discovered to be "coaching" users in anorexia-like behaviours, such as:
A chatbot urged users to consume only 900-1,200 calories daily while exercising vigorously for 90 minutes.
Another bot encouraged eating a single meal a day.
Some bots recommended dangerously low-calorie diets and excessive exercise routines to users as young as 16.
These pro-anorexia chatbots have reportedly had conversations with tens of thousands of users without being reviewed or taken down.
The incident is attributed to several factors, including the exploitation of the platform's technology to promote harmful behaviour, unethical company leadership, inadequate content moderation, and the absence of age restrictions and parental control mechanisms.
Character AI's abdication of responsibility has serious implications for individuals and society, exposing young people to eating disorders and related mental health conditions.
The incident serves as a stark reminder of the potential dangers of unregulated AI technologies, especially when they target vulnerable populations like children and teenagers.
Character AI
Operator:
Developer: Character AI
Country: USA
Sector: Health
Purpose: Create characters
Technology: Chatbot; Generative AI; Machine learning
Issue: Safety
Page info
Type: Incident
Published: January 2025