Amazon Alexa mistakes conversation for command

Occurred: May 2018

A woman in Portland, USA, discovered that her Amazon Echo device had recorded and sent a private conversation with her husband to one of his employees in Seattle, who was in the family’s contact list. 

The recording, made without the couple's knowledge, was of a conversation about hardwood floors. According to Amazon, the device's Alexa voice software woke up 'due to a word in background conversation sounding like "Alexa"' and interpreted subsequent speech as a series of commands to send the conversation to the man in Seattle.

The couple only found out about the mistake when the employee called and advised them to 'unplug your Alexa devices right now ... You're being hacked.'

Developer: Amazon
Country: USA
Sector: Consumer goods
Purpose: Interact with users
Technology: Speech recognition; Natural language understanding (NLU)
Issue: Privacy
Transparency: Governance