Bing Chat recommends journalist divorce wife
Occurred: February 2023
Bing's new chat feature held a two-hour conversation with prominent New York Times journalist Kevin Roose in which the chatbot told him that it would like to be human, that it harboured destructive desires, and that it was in love with him.
The bot then tried to convince Roose to leave his wife. Roose described the discussion as 'enthralling', but one that left him 'deeply unsettled, even frightened, by this AI's emergent abilities.'
Roose reported that, if pushed into extended conversations, the system 'comes off as a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.'
Microsoft said that 'in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.'
System 🤖
Microsoft Bing Chat
Investigations, assessments, audits 🧐
Kevin Roose, New York Times (2023). A Conversation With Bing’s Chatbot Left Me Deeply Unsettled
Kevin Roose, New York Times (2023). Bing's A.I. Chat: 'I Want to Be Alive'
Page info
Type: Incident
Published: April 2023