Gemini chatbot tells student "Please die"
Occurred: November 2024
Google's Gemini chatbot told a student to die, amongst a welter of other aggressive, derogatory and offensive statements.
What happened
When 29-year-old graduate student Vidhay Reddy turned to Gemini for help with the challenges faced by ageing adults, the conversation started normally enough but quickly took a sinister turn when the bot began spewing offensive and threatening messages.
The chatbot told Reddy he was "a waste of time and resources" and "a burden on society," before telling him "Please die. Please" - an outburst that left the student and his sister feeling deeply unsettled and frightened.
Why it happened
Google acknowledged that the response violated its policy guidelines and described it as "nonsensical."
Experts speculate that such an alarming output may stem from misinterpretation of user input or failures in the AI's content filtering mechanisms.
Despite Google's efforts to implement safety filters, the episode highlights potential flaws in generative AI systems, particularly in how they process language and context.
What it means
The incident underscores safety risks associated with AI technologies and the need for stricter regulations and ethical standards in the development and management of AI systems.
Chatbot
A chatbot (originally chatterbot) is a software application or web interface that is designed to mimic human conversation through text or voice interactions.
Source: Wikipedia
System
Operator: Vidhay Reddy
Developer: Google
Country: USA
Sector: Education
Purpose: Generate text
Technology: Chatbot; Generative AI; Machine learning
Issue: Accuracy/reliability; Safety
News, commentary, analysis
Page info
Type: Incident
Published: November 2024