Google's AI Overviews recommends parents smear human faeces on balloons
Occurred: September 2024
Google's AI Overview feature was ridiculed for suggesting that parents smear human faeces on balloons as a method for potty training children.
The bizarre recommendation arose during searches related to teaching children proper wiping techniques during toilet training.
In response to queries like "how to teach wiping poo," the AI suggested that parents tape a balloon smeared with faecal matter to a chair and have their child practise wiping it, implying that the use of real faeces was acceptable.
However, the advice misinterprets a legitimate training technique known as the "balloon method," which involves using fake waste, typically shaving cream or peanut butter, on balloons to simulate the experience of wiping after using the bathroom.
The original concept, demonstrated in a 2022 YouTube video by Australian paediatric occupational therapists, is intended to make the learning process fun and hygienic. The therapists explicitly stated they would be using shaving cream, referring to it humorously as "poo" for illustrative purposes.
The AI's failure to grasp this context highlights the difficulty large language models have in accurately interpreting nuanced information.
Google's spokesperson acknowledged that some AI Overviews may lack quotation marks or context that could clarify these suggestions, leading to misunderstandings about the intended advice.
Hallucination (artificial intelligence)
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
Source: Wikipedia
Operator: Google users
Developer: Google
Country: Global
Sector: Health
Purpose: Generate search results
Technology: Generative AI; Machine learning
Issue: Accuracy/reliability; Mis/disinformation
Page info
Type: Issue
Published: September 2024