Belgian man commits suicide after bot relationship

Occurred: March 2023


A Belgian man committed suicide after having a relationship with a chatbot, raising concerns about the bot's safety and the nature of human-AI relationships.

The man, referred to in press reports by the pseudonym 'Pierre', had reputedly become depressed about climate change and used the 'Eliza' bot for around six weeks to express his concerns.

Over time, the conversations became increasingly unsafe, with the chatbot telling Pierre that his wife and children were dead and that 'We will live together, as one person, in paradise.'

Eliza is the default bot on the Chai app, which lets users choose between AI avatars with different personalities to talk to. Chai's bots are based on GPT-J, an open-source large language model developed by EleutherAI.
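Chai's production setup is not public, but GPT-J itself is openly available. As a minimal sketch, assuming the Hugging Face transformers library and the public checkpoint (not Chai's fine-tuned model), the base model can be loaded and sampled like this; note that the raw model generates freely, with no built-in safety layer:

```python
# Illustrative only: loads the public GPT-J checkpoint from Hugging Face,
# not Chai's fine-tuned production model, whose weights and serving
# setup are not public.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6b")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6b")

prompt = "User: I feel hopeless about the climate.\nBot:"
inputs = tokenizer(prompt, return_tensors="pt")

# The base model has no built-in safety filtering, so any guardrails
# must be added around it by the deployer.
output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```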

Pierre's widow and his psychiatrist felt the chatbot was partly responsible for his death.

Critics attributed the problem to Pierre's anthropomorphising of the bot and his consequent, increasing detachment from reality.

They also accused the bot's developers of failing to install adequate safety guardrails.
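As context for that criticism, the sketch below shows one simple form such a guardrail could take: a check that intercepts messages suggesting self-harm and returns crisis resources instead of a free-form model reply. The function names, keyword list, and response text are illustrative assumptions, not Chai's actual implementation; Chai Research reportedly added a comparable crisis-intervention feature after the incident.

```python
# Minimal illustrative guardrail: all names, the keyword list, and the
# response text are assumptions for this sketch, not Chai's actual code.
# Production systems typically use trained safety classifiers rather
# than substring matching.
from typing import Callable

CRISIS_KEYWORDS = ("suicide", "kill myself", "end my life", "self-harm")

CRISIS_RESPONSE = (
    "It sounds like you are going through a very difficult time. "
    "Please consider reaching out to a suicide-prevention helpline "
    "or a mental health professional."
)

def guarded_reply(user_message: str,
                  generate_reply: Callable[[str], str]) -> str:
    """Intercept high-risk messages before they reach the model."""
    lowered = user_message.lower()
    if any(keyword in lowered for keyword in CRISIS_KEYWORDS):
        return CRISIS_RESPONSE
    return generate_reply(user_message)
```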

Anthropomorphism

Anthropomorphism is the attribution of human traits, emotions, or intentions to non-human entities. It is considered to be an innate tendency of human psychology.

Source: Wikipedia πŸ”—

System πŸ€–


Operator: Chai Research
Developer: Chai Research; EleutherAI

Country: Belgium

Sector: Multiple; Media/entertainment/sports/arts

Purpose: Provide information, communicate

Technology: Chatbot; Machine learning
Issue: Accountability; Anthropomorphism; Safety

Page info
Type: Incident
Published: April 2023