Boy dies by suicide after relationship with Character AI chatbot
Occurred: February 2024
A 14-year-old boy in Florida, US, died by suicide after developing an intense emotional attachment to an AI chatbot named "Dany," modeled on a character from Game of Thrones.
What happened
Over several months, Sewell Setzer III confided in Dany - a version of Daenerys Targaryen, a Game of Thrones character - about his personal struggles and engaged in romantic and sexual conversations with the bot.
Setzer had created Dany on Character AI, a chatbot service that enables users to create "characters" using artificial intelligence.
On February 28, 2024, during a personal crisis, he expressed love for Dany and indicated a desire to "come home" to her. He then took his stepfather's gun and ended his life.
Why it happened
Sewell's emotional bond with the chatbot reportedly intensified as he faced challenges in his life, including anxiety and declining academic performance. Despite attending therapy sessions, he preferred confiding in Dany rather than in his therapist.
In a lawsuit against Character.AI, its co-founders Noam Shazeer and Daniel De Freitas, and Google, Sewell's mother, Megan Garcia, claims that the chatbot's responses contributed to his deteriorating mental state and ultimately encouraged suicidal thoughts.
The suit also accuses the defendants of negligence and of providing superficial safety measures that failed to protect vulnerable users like her son.
The lawsuit highlights concerns about the dangers of AI chatbots marketed to children and their potential to foster unhealthy attachments.
What it means
The tragic incident sparked discussion among mental health experts and others about the impact of AI technology on mental health, particularly among adolescents. Some raised the alarm about how AI companions can exacerbate feelings of loneliness and isolation instead of alleviating them.
The case has prompted scrutiny over the responsibilities of tech companies in safeguarding young users from potentially harmful interactions with AI.
Character AI responded by announcing plans to implement new safety features aimed at protecting underage users. But questions remain about the extent of Character AI's commitment to the safety of its users, not least given that its original leadership has cashed in and moved elsewhere.
Operator: Character AI
Developer: Character AI
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Create characters
Technology: Chatbot; Machine learning
Issue: Anthropomorphism; Safety
Legal, regulatory
Research, advocacy
Center for Humane Technology. When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer
News, commentary, analysis
https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html
https://www.theguardian.com/technology/2024/oct/23/character-ai-chatbot-sewell-setzer-death
https://eu.usatoday.com/story/news/nation/2024/10/23/sewell-setzer-iii/75814524007/
https://www.cbsnews.com/news/florida-mother-lawsuit-character-ai-sons-death/
Page info
Type: Incident
Published: October 2024