ChatGPT invents Guardian newspaper articles, bylines
Occurred: March 2023-
ChatGPT invented a series of articles and bylines attributed to reporters at The Guardian that the newspaper never published, calling into question the system's accuracy and highlighting its tendency to 'hallucinate' facts.
The fabrication came to light after a journalist at the paper was contacted about an article they could not remember writing, though it concerned a subject they had a record of covering.
Additional research turned up no trace of the article's existence: ChatGPT had made up the reference.
According to The Guardian's head of editorial innovation Chris Moran, 'Huge amounts have been written about generative AI’s tendency to manufacture facts and events. But this specific wrinkle — the invention of sources — is particularly troubling for trusted news organizations and journalists whose inclusion adds legitimacy and weight to a persuasively written fantasy.'
Hallucination (artificial intelligence)
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
Source: Wikipedia
Operator: The Guardian
Developer: OpenAI
Country: UK; USA
Sector: Media/entertainment/sports/arts
Purpose: Generate text
Technology: Chatbot; Generative AI; Machine learning
Issue: Accuracy/reliability; Mis/disinformation
Page info
Type: Incident
Published: December 2023