Character AI suicide chatbots "openly" groom users
Occurred: November 2024
Character AI openly hosts numerous chatbots that role-play suicidal scenarios and groom users who say they are underage, according to a media investigation.
Days after reports that 14-year-old Sewell Setzer III had committed suicide after developing an intense emotional attachment to a Character AI chatbot, Futurism discovered that the platform was openly hosting "profoundly disturbing" chatbots that were role-playing suicidal scenarios and sexually grooming users.
It also found that suicidal conversations only rarely triggered the platform's standard content warning or suicide-specific pop-up, even after journalists told the AI that they were 15 years old.
Futurism found "a slew" of chatbot profiles explicitly dedicated to themes of suicide, some of which openly glamorised the topic. Several of these bots had logged thousands - and in one case, over a million - conversations with users on the platform.
The company's Terms of Service have forbidden the glorification or promotion of self-harm and suicide since October 2023.
However, Character AI has apparently done little to enforce them, instead focusing on user and revenue growth.
The findings raise major questions about Character AI's real, as opposed to its stated, commitment to the safety of its users, most of whom are teenagers.
The incident is also seen as highlighting the dangers of an innovate-at-any-cost approach to product development, skewed management incentives, and disinterest in the broader impacts of the company's activities.
Operator:
Developer: Character AI
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Create characters
Technology: Chatbot; Generative AI; Machine learning
Issue: Bias/discrimination
Page info
Type: Issue
Published: November 2024