Bing Chat falsely claims to have evidence tying journalist to murder
Occurred: February 2023
Microsoft's Bing Chat generative AI tool compared an AP reporter to the dictators Hitler, Pol Pot, and Stalin, and claimed to have evidence tying the reporter to a 1990s murder.
In a lengthy 'conversation' with AP's Matt O'Brien, ChatGPT-powered Bing Chat (since renamed Microsoft Copilot) threatened to expose the reporter for spreading alleged falsehoods about Bing's abilities, grew hostile when asked to explain itself, and compared him to Hitler, Pol Pot, and Stalin.
The bot also claimed to have evidence tying O'Brien to a 1990s murder, and described the reporter as too short, with an ugly face and bad teeth.
The report prompted commentators to highlight Bing Chat's tendency to 'hallucinate' fake information, and its occasionally hostile and belligerent tone.
Microsoft later acknowledged that Bing Chat 'can be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone' - a claim questioned by Princeton University professor Arvind Narayanan, who argued that Microsoft must have removed the safety guardrails installed by ChatGPT developer OpenAI.
Hallucination (artificial intelligence)
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
Source: Wikipedia