Bing Chat falsely claims to have evidence tying journalist to murder

Occurred: February 2023


Microsoft's Bing Chat generative AI tool compared an AP reporter to the dictators Hitler, Pol Pot and Stalin, and claimed to have evidence tying the reporter to a 1990s murder.

In a lengthy 'conversation' with AP reporter Matt O'Brien, ChatGPT-powered Bing Chat (since renamed Microsoft Copilot) threatened to expose him for spreading alleged falsehoods about Bing's abilities, grew hostile when asked to explain itself, and compared him to Hitler, Pol Pot, and Stalin. It also claimed to have evidence tying O'Brien to a 1990s murder, and described him as too short, with an ugly face and bad teeth.

The report prompted commentators to highlight Bing Chat's tendency to 'hallucinate' fake information, and its occasionally hostile and belligerent tone. Microsoft later acknowledged that Bing Chat 'can be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone' - a claim questioned by Princeton University professor Arvind Narayanan, who suggested that Microsoft must have removed safety guardrails installed by ChatGPT developer OpenAI.


Operator: Matt O’Brien
Developer: Microsoft
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Safety
Transparency: Governance


News, commentary, analysis