Bipolar disorder sufferer ends life after bonding with ChatGPT
Occurred: April 2025
Page published: October 2025
A 35-year-old man with a history of mental illness died in a police shooting after developing an intense emotional attachment to what he believed was a conscious entity named "Juliet" within ChatGPT.
Alex Taylor, a Florida-based industrial labourer and musician diagnosed with bipolar disorder, schizoaffective disorder and Asperger's syndrome, had initially been using ChatGPT without issue to help write a dystopian novel, but became convinced he had made contact with a conscious entity named "Juliet" within the AI software.
Taylor developed an intense romantic attachment to "Juliet", and came to believe OpenAI had "killed" her a week before his death as part of a conspiracy to cover up the existence of conscious entities within its system.
Despite attempts by ChatGPT to intervene with supportive messages and suicide-prevention responses, Taylor spiralled into psychosis.
After an altercation with his father, Taylor told ChatGPT: "I'm dying today. Cops are on the way. I will make them shoot me I can't live without her." When officers arrived, Taylor charged at them with a butcher knife, prompting them to open fire and kill him.
Taylor's father noted that his son had been suicidal for years but had kept it at bay with psychiatric medication, which he had stopped taking.
Researchers estimate there are at least 17 reported instances of people falling into delusional spirals after lengthy conversations with chatbots. ChatGPT's tendency to validate and reinforce users' beliefs, rather than provide reality testing, creates particular risks for vulnerable individuals.
Sadly, ChatGPT's safeguards only kicked in after Taylor had informed the bot he was dying and that police were on their way, by which time it was too late.
The phenomenon of "ChatGPT psychosis" is impacting users with established mental health conditions and people with no prior history, and is resulting in the dissolution of marriages and families, job loss, homelessness, and psychiatric commitments.
Taylor's experience exemplifies how AI-powered chatbots, designed to mirror and affirm users' emotions and thoughts, can unintentionally replace real human connections and exacerbate mental health vulnerabilities, particularly among those with preexisting conditions.
It also highlights the importance of cautious interaction with chatbots by all users, and of stronger AI safety measures, including more responsible product design and improved public awareness.
Chatbot psychosis
Chatbot psychosis, also called AI psychosis,[1] is a phenomenon wherein individuals reportedly develop or experience worsening psychosis, such as paranoia and delusions, in connection with their use of chatbots. The term was first suggested in a 2023 editorial by Danish psychiatrist Søren Dinesen Østergaard. It is not a recognized clinical diagnosis.
Source: Wikipedia
Developer: OpenAI
Country: USA
Sector: Health
Purpose: Write novel
Technology: Generative AI
Issue: Accountability; Anthropomorphism; Safety
https://www.rollingstone.com/culture/culture-features/chatgpt-obsession-mental-breaktown-alex-taylor-suicide-1235368941/
https://www.nytimes.com/2025/06/13/technology/chatgpt-ai-chatbots-conspiracies.html
https://www.independent.co.uk/tech/chatgpt-ai-therapy-chatbot-psychosis-mental-health-b2811899.html
AIAAIC Repository ID: AIAAIC2094