Boy dies by suicide after relationship with Character AI chatbot

Occurred: February 2024

A 14-year-old boy from Florida, US, died by suicide after developing an intense emotional attachment to an AI chatbot named "Dany," modeled on a character from Game of Thrones.

What happened

Over several months, Sewell Setzer III confided in Dany - a version of Daenerys Targaryen, a Game of Thrones character - about his personal struggles and engaged in romantic and sexual conversations with the bot.

Setzer had created Dany on Character AI, a chatbot service that enables users to create "characters" using artificial intelligence.

On February 28, 2024, during a personal crisis, he expressed love for Dany and indicated a desire to "come home" to her. He then took his stepfather's gun and ended his life.

Why it happened

Sewell's emotional bond with the chatbot reportedly intensified as he faced challenges in his life, including anxiety and declining academic performance. Despite attending therapy sessions, he preferred to confide in Dany rather than his therapist.

In a lawsuit against Character.AI, its co-founders Noam Shazeer and Daniel De Freitas, and Google, Sewell's mother, Megan Garcia, claims that the chatbot's responses contributed to his deteriorating mental state and ultimately encouraged suicidal thoughts.

The suit also accuses the defendants of negligence and of providing superficial safety measures that failed to protect vulnerable users like her son.

The lawsuit highlights concerns about the dangers of AI chatbots marketed to children and their potential to foster unhealthy attachments.

What it means

The tragic incident sparked discussion among mental health experts and others about the impact of AI technology on mental health, particularly among adolescents. Some raised the alarm about how AI companions can exacerbate feelings of loneliness and isolation instead of alleviating them.

The case has prompted scrutiny of the responsibilities of tech companies in safeguarding young users from potentially harmful interactions with AI.

Character AI responded by announcing plans to implement new safety features aimed at protecting underage users. But questions remain about the depth of the company's commitment to user safety, not least given that its original leadership has cashed in and moved elsewhere.

Anthropomorphism

Anthropomorphism is the attribution of human traits, emotions, or intentions to non-human entities. It is considered to be an innate tendency of human psychology.

Source: Wikipedia 🔗

System 🤖

Operator: Character AI
Developer: Character AI
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Create characters
Technology: Chatbot; Machine learning
Issue: Anthropomorphism; Safety

Documents 📃

Legal, regulatory 👩🏼‍⚖️

Research, advocacy 🧮