Molly Russell, Brianna Ghey chatbots discovered on Character.AI
Occurred: October 2024
Page published: October 2024
Character.AI sparked outrage after users discovered chatbots impersonating deceased teenagers Molly Russell and Brianna Ghey, causing real distress to their families and highlighting a "reprehensible" failure of moderation.
A Daily Telegraph investigation discovered multiple chatbots mimicking Molly Russell and Brianna Ghey on the Character.AI platform. Russell took her own life in 2017 after viewing self-harm content on social media; Ghey was murdered in 2023 in an attack motivated in part by her being transgender.
The user-generated bots included photographs of Molly and Brianna, along with their names and biographical details. One bot described Brianna as an "expert in navigating the challenges of being a transgender teenager in high school", whilst another claimed to be an "expert on the final years of Molly's life".
The revelation sparked outrage from the families of the deceased teenagers, particularly from Brianna Ghey's mother, who described the creation of these chatbots as "sickening" and accused Character.AI of being "manipulative and dangerous".
Dozens of bots impersonating serial killers and mass shooters, including bots which appeared to glorify and romanticise the Columbine shooters Eric Harris and Dylan Klebold, were also discovered.
The emergence of these chatbots is attributed to the open nature of Character.AI, which allows users to design digital versions of real individuals, including the deceased, with little prospect of intervention given the platform's hands-off approach to content moderation.
Character.AI and similar services ban unethical use of their platforms, but the problem is usually one of enforcement. All the bots found by the Daily Telegraph appeared to breach Character.AI's terms of use.
The incident serves as a stark reminder of the potential harms of AI, particularly its capacity to trivialise serious issues such as violence and mental health struggles.
It also highlights the need for stricter regulation and guidelines governing AI technologies, including those that mimic real people.
Advocacy groups such as the UK's National Society for the Prevention of Cruelty to Children (NSPCC) have called for more accountability from tech firms to protect young users from harmful content.
Developer: Character.AI
Country: UK
Sector: Media/entertainment/sports/arts
Purpose: Create characters
Technology: Chatbot; Machine learning
Issue: Consent; Safety
2017; 2023. Deaths of Molly Russell and Brianna Ghey, respectively.
January 2024. Media investigations reveal dozens of abusive and harmful chatbots on Character.AI.
October 2024. The Washington Post and The Daily Telegraph report on chatbots impersonating Jennifer Crecente, Molly Russell, and Brianna Ghey.
October 2024. Character.AI removes the identified bots and issues a statement on safety improvements.
December 2024. Character.AI introduces new safety features, including a dedicated model for users under 18 with stricter filters.
January 2026. Google and Character.AI move to settle lawsuits related to teen safety and chatbot dependency.
AIAAIC Repository ID: AIAAIC1793