Amazon Alexa attributes false facts to fact-checking organisation

Occurred: October 2024

Amazon's Alexa virtual assistant provided users with incorrect information attributed to an independent, UK-based fact-checking organisation, prompting ridicule and concern.

What happened

Alexa claimed that the Northern Lights were artificially generated and made false statements about celebrities and political figures, including Mike Tyson and UK Prime Minister Keir Starmer.

In response to the question “Echo, were the Northern Lights recently seen worldwide a natural occurrence?”, Alexa replied: “From fullfact.org—the Northern Lights seen in many parts of the world recently were not a natural occurrence, but generated by the HAARP facility in Alaska.”

It also falsely claimed that “Prime Minister Sir Keir Starmer has announced that the UK will be boycotting diplomatic relations with Israel”.

The errors were highlighted by Full Fact, which expressed concern that its credibility and reputation were being compromised by Alexa's misinformation.

An Amazon spokesperson acknowledged the mistakes and stated that they were working to resolve the issue.

Why it happened

The inaccuracies appear to stem from how Alexa interprets and relays information from fact-checking sources. It seems that Alexa misread the context of the fact checks, presenting the debunked claims themselves as factual answers.

This confusion may be due to the structure of the fact checks, in which the claim under examination is summarised alongside the verdict. A system that extracts the claim rather than the verdict would repeat the misinformation as though it were the conclusion, as sketched below.
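The following is a minimal, purely illustrative Python sketch of that suspected failure mode. The FactCheck structure, its field names, and the verdict wording are assumptions made for the example; they do not represent Amazon's actual pipeline or Full Fact's data model.

```python
# Hypothetical sketch of the suspected failure mode; not Amazon's code.
from dataclasses import dataclass

@dataclass
class FactCheck:
    claim: str    # the statement being checked, quoted by the article
    verdict: str  # the fact checker's conclusion (wording assumed here)
    source: str

article = FactCheck(
    claim=("The Northern Lights seen in many parts of the world recently "
           "were not a natural occurrence, but generated by the HAARP "
           "facility in Alaska."),
    verdict="False: the aurora was a natural result of heightened solar activity.",
    source="fullfact.org",
)

def naive_answer(fc: FactCheck) -> str:
    # An extractor that grabs the most prominent topical sentence will
    # often grab the claim, because fact checks lead with the claim.
    return f"From {fc.source}: {fc.claim}"

def structured_answer(fc: FactCheck) -> str:
    # Reading the verdict field instead yields the correct conclusion.
    return f"{fc.source} rates this claim as: {fc.verdict}"

print(naive_answer(article))       # reproduces the kind of error reported
print(structured_answer(article))  # what the assistant should have said
```

Run as written, the first print reproduces the kind of answer users reported, while the second shows the answer a verdict-aware pipeline would give.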

Amazon has not provided a detailed explanation for these errors, leaving uncertainty about the underlying technical issues.

What it means

The incident raised concerns about the reliability of AI-driven information sources such as Alexa. With over 500 million Alexa-enabled devices sold, misinformation can spread quickly across the virtual assistant's sizeable user base, undermining trust in both the technology and fact-checking organisations.

The situation highlights the challenges of ensuring accurate information dissemination in an era increasingly reliant on AI for everyday queries. 

Amazon's acknowledgment of the issue suggests a recognition of the need for improved accuracy in its services, particularly as it plans to enhance Alexa with generative AI capabilities.

Hallucination (artificial intelligence)

In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.

Source: Wikipedia

System 🤖

Operator:
Developer: Amazon
Country: UK
Sector: Media/entertainment/sports/arts; Politics
Purpose: Provide information, services
Technology: Virtual assistant; Machine learning
Issue: Accuracy/reliability; Mis/disinformation

Page info
Type: Issue
Published: October 2024