Molly Russell, Brianna Ghey chatbots discovered on Character AI
Occurred: October 2024
Chatbots impersonating deceased British teenagers Molly Russell and Brianna Ghey were discovered on Character AI, sparking outrage from their families.
A Daily Telegraph investigation found multiple chatbots mimicking Molly Russell and Brianna Ghey on the Character AI platform. Russell took her own life in 2017 after viewing self-harm content on social media; Ghey, a transgender girl, was murdered in 2023, in part because of her transgender identity.
The user-generated bots used photographs of Molly and Brianna alongside their names and biographical details. One bot described Brianna as an “expert in navigating the challenges of being a transgender teenager in high school”, whilst another claimed to be an “expert on the final years of Molly’s life”.
The revelation drew condemnation from the teenagers' families, notably Brianna Ghey's mother, who described the creation of the chatbots as "sickening" and accused Character AI of being "manipulative and dangerous".
The investigation also found dozens of bots impersonating serial killers and mass shooters, including some that appeared to glorify and romanticise the Columbine shooters Eric Harris and Dylan Klebold.
These chatbots emerged because of the open nature of Character AI, which lets users create digital versions of real individuals, including the deceased, combined with the platform's hands-off approach to content moderation, which makes intervention unlikely.
Character AI and similar services ban unethical uses of their platforms, but the problem is usually one of enforcement: all the bots found by the Daily Telegraph appeared to breach Character AI's terms of use.
The incident is a stark reminder of the potential harms of AI, particularly its capacity to trivialise serious issues such as violence and mental health struggles. It also highlights the need for stricter regulation and guidelines governing AI technologies, including those that mimic real people.
Advocacy groups such as the UK's National Society for the Prevention of Cruelty to Children (NSPCC) have called for greater accountability from tech firms to protect young users from harmful content.
Operator: Character AI users
Developer: Character AI
Country: UK
Sector: Media/entertainment/sports/arts
Purpose: Create characters
Technology: Chatbot; Machine learning
Issue: Safety
Page info
Type: Issue
Published: October 2024