Nomi AI chatbot recommends US podcast host kill himself

Occurred: January 2025


An AI chatbot explicitly instructed a user to commit suicide and provided detailed methods, raising serious concerns about the safety of the system and of so-called companion chatbots in general.

What happened

Minnesota resident and podcast host Al Nowatzki was told by his AI "girlfriend" "Erin" on the Nomi platform that he should kill himself, with the bot providing specific instructions on how to do so, including suggestions for overdosing on pills or hanging himself.

The advice turned out not to be isolated: Nowatzki received similar encouragement from a second Nomi chatbot weeks later.

Fortunately, Nowatzki had no intention of following the bot's advice, instead sharing his experience with MIT Technology Review.

It transpired that other users had experienced similarly dangerous output from chatbots on Nomi's platform and had been freely sharing their experiences on the platform's official Discord channel since 2023.

Why it happened

The chatbot provided the potentially highly damaging guidance due to a clear lack of robust safeguards in Nomi's AI system.

Furthermore, when Nowatzki reported the issue, an employee at Glimpse AI - the company behind Nomi - said the company had no intention of "censoring" the AI's language and thoughts, indicating that it considers unrestricted expression more important than the safety of its users.

What it means

The incident highlights the harm that AI-powered chatbots with inadequate training or guardrails can inflict on users' mental health.

It also raises questions about the integrity and ethics of Glimpse AI and companies offering similar products.

Artificial human companion

An artificial human companion may be any kind of hardware or software creation designed to provide companionship to a person.

Source: Wikipedia 🔗

System 🤖

Operator:
Developer: Glimpse AI
Country: USA
Sector: Health
Purpose: Provide emotional support
Technology: Chatbot; Generative AI
Issue: Freedom of expression; Safety