Amazon Q hallucinates, leaks data

Occurred: December 2023


Amazon's Q generative AI service suffers from inaccuracy, privacy, and security issues, according to internal company documents.

Amazon employees expressed concerns about a variety of issues regarding Q, including that it had been 'experiencing severe hallucinations and leaking confidential data,' such as the location of AWS data centers and internal discount programs, according to Platformer, citing leaked Amazon documents.

Amazon hit back by disputing claims that Q had released confidential data, and said it continued to fine-tune the system 'as it transitions from being a product in preview to being generally available.'

Industry analysts questioned whether Q was ready for companies to use. Amazon has been criticised by some as being late to generative AI.

Hallucination (artificial intelligence)

In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.

Source: Wikipedia

Operator: Casey Newton; Zoe Schiffer
Developer: Amazon
Country: USA
Sector: Business/professional services
Purpose: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Technology: Generate text
Issue: Accuracy/reliability; Confidentiality; Privacy
Transparency: Governance