Google AI Overviews tell users to add glue to pizzas
Occurred: May 2024
Google’s “AI Overviews” feature was criticised for providing factually inaccurate, misleading and nonsensical answers, raising questions about its reliability and its potential to spread misinformation and perpetuate bias.
The AI-generated summaries, which appear at the top of Google’s search results, were found to contain basic factual errors.
In one instance, users reported that AI Overviews suggested using "non-toxic glue" to stop cheese sliding off pizza, which was traced back to a troll post. It also recommended people consume rocks, based on a satirical article from The Onion which had humorously claimed that eating rocks could be beneficial.
It also advised a user to drink urine to pass kidney stones, falsely claimed that Barack Obama was the first Muslim president of the United States, incorrectly stated that “astronauts have met cats on the moon”, and claimed that "Neil Armstrong said, ‘One small step for man’ because it was a cat’s step".
After a public backlash, Google said it would refine AI Overviews by limiting which queries generate AI summaries, particularly avoiding those that might yield nonsensical or satirical results, and taking action against specific outputs that violated its content policies.
Experts expressed concern that the system could perpetuate bias and misinformation, errors seen as particularly problematic given the prominence of AI Overviews at the top of Google’s search results page.
Hallucination (artificial intelligence)
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
Source: Wikipedia 🔗
Operator: Google users
Developer: Google
Country: USA
Sector: Multiple
Purpose: Generate search results
Technology: Generative AI; Machine learning
Issue: Accuracy/reliability; Bias/discrimination; Mis/disinformation
News, commentary, analysis 🗞️
https://www.theverge.com/2024/5/23/24162896/google-ai-overview-hallucinations-glue-in-pizza
https://edition.cnn.com/2024/05/24/tech/google-search-ai-results-incorrect-fix/index.html
https://www.nytimes.com/2024/05/24/technology/google-ai-overview-search.html
https://searchengineland.com/google-ai-overview-fails-442575
https://www.technologyreview.com/2024/05/31/1093019/why-are-googles-ai-overviews-results-so-bad/
Page info
Type: Issue
Published: May 2024
Last updated: June 2024