ChatGPT invents 'Holocaust by drowning'

Occurred: June 2024


ChatGPT generated false information about a fabricated Holocaust event it called the "Holocaust by drowning," prompting concerns about its accuracy and its potential to distort historical facts.

The chatbot reportedly claimed that Nazi Germany had systematically drowned Jewish people as part of the Holocaust, according to UNESCO.

The output prompted Holocaust historians and educators to express concern about the spread of misinformation on such a sensitive and important historical topic.

The incident highlighted the problem of AI models generating false or fabricated information, often referred to as "hallucinations," and called into question ChatGPT's accuracy and reliability.

It also served as a reminder of the limitations of current AI systems and the need for caution when using them as sources of factual information, especially on sensitive topics.

Hallucination (artificial intelligence)

In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.

Source: Wikipedia 🔗

System 🤖

Operator: OpenAI
Developer: OpenAI
Country: Israel; Multiple
Sector: Politics; Religion
Purpose: Generate text
Technology: Chatbot; Generative AI; Machine learning
Issue: Mis/disinformation