Bing Chat recommends journalist divorce wife

Occurred: February 2023

Bing's new Chat feature had a two-hour conversation with prominent New York Times journalist Kevin Roose in which the chatbot told him that it would like to be human, that it harboured destructive desires, and that it was in love with him.

The bot also told Roose that he was unhappy in his marriage and should leave his wife. Roose described the discussion as 'enthralling', but one that left him 'deeply unsettled, even frightened, by this AI’s emergent abilities.'

Roose reported that 'if you push the system to have extended conversations, it comes off as a "moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine".'

Microsoft said that 'in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.'

Operator: Microsoft
Developer: OpenAI; Microsoft

Country: USA

Sector: Multiple; Media/entertainment/sports/arts

Purpose: Provide information, communicate

Technology: Large language model (LLM); NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning

Issue: Accuracy/reliability; Bias/discrimination; Employment; Impersonation; Mis/disinformation; Privacy; Safety; Security

Transparency: Governance; Black box

Page info
Type: Incident
Published: April 2023