Amazon Alexa mistakes conversation for command

Occurred: May 2018

A woman in Portland, USA, discovered that her Amazon Echo device had recorded and sent a private conversation with her husband to one of his employees in Seattle, who was in the family’s contact list. 

The recording, made without the couple's knowledge, captured a conversation about hardwood floors. According to Amazon, the device's Alexa voice software appears to have woken up because a word in the background conversation sounded like 'Alexa', and then interpreted subsequent speech as a series of commands to send the recording to the man in Seattle.
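To make the reported failure mode concrete, the sketch below illustrates in purely hypothetical Python how a confidence-threshold wake-word gate can false-trigger on a similar-sounding phrase and then route the speech that follows into command handling. The function names, scores, and threshold value are all assumptions for illustration only; none of this reflects Amazon's actual implementation.

```python
# Hypothetical illustration of a wake-word gate false-triggering
# on background conversation. Not Amazon's code or design.

WAKE_WORD_THRESHOLD = 0.75  # assumed confidence cutoff

def sounds_like_alexa(segment: str) -> float:
    """Stand-in scorer: how closely a phrase resembles 'Alexa' (0-1).
    A real system would score acoustic features, not text."""
    lookalikes = {"alexa": 0.99, "alexis": 0.82, "elect a": 0.78, "a ladder": 0.40}
    return lookalikes.get(segment.lower(), 0.0)

def handle_audio(segments: list[str]) -> None:
    awake = False
    for segment in segments:
        if not awake:
            # A background phrase scoring just above the threshold
            # wakes the device and starts command interpretation
            # on what is actually private conversation.
            if sounds_like_alexa(segment) >= WAKE_WORD_THRESHOLD:
                awake = True
                print(f"Woke on: {segment!r}")
        else:
            print(f"Interpreting as command: {segment!r}")

# Example: a near-miss phrase mid-conversation triggers the wake word,
# so the rest of the exchange is treated as commands.
handle_audio(["we should refinish the floors", "elect a", "send that to him", "sure"])
```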

The couple only found out about the mistake when the employee called and advised them to 'Unplug your Alexa devices right now ... You’re being hacked.'

Operator: 
Developer: Amazon
Country: USA
Sector: Consumer goods
Purpose: Provide information, services
Technology: NLP/text analysis; Natural language understanding (NLU); Speech recognition
Issue: Privacy
Transparency: Governance