Occurred: December 2016
Amazon's Alexa voice software misinterpreted a child's prompt and responded with crude pornographic language, prompting concerns about its safety.
A child asked Amazon's Alexa voice software to 'Play Digger Digger', one of his favourite songs, only to be met with crude and pornographic language, to the horror of his parents.
Instead of playing the requested song, Alexa responded, 'You want to hear a station for porn detected… porno ringtone hot chick amateur girl calling sexy,' followed by a string of further crude and explicit words.
After an initial pause, seemingly as the adults struggled to believe what they were hearing, they could be heard shouting 'No! No! No! Alexa, stop!' to cut the response short.
Amazon later said that it had fixed the issue and was building additional restrictions to prevent similar incidents in the future.
Page info
Type: Incident
Published: March 2023