Amazon wrongly disables Echo account after driver mishears racial slur

Occurred: July 2023

An Amazon customer had his smart home account shut down for a week after a delivery driver mistakenly thought he had heard a racist slur through the customer's doorbell.

Baltimore-based Microsoft engineer Brandon Jackson discovered he had been locked out of his Amazon Echo account the day after a delivery driver had dropped off a package at his home. Although nobody was at home at the time, the driver claimed he had heard a racist slur through the doorbell. During the week-long suspension, Jackson was unable to use his Echo smart home applications or his Alexa virtual assistant software.

It turned out the driver had reported 'receiving racist remarks' from Jackson's Ring doorbell, although, according to home video footage reviewed by Jackson, it had in fact said 'Excuse me, can I help you?' to the headphone-wearing driver.

Amazon failed to inform Jackson that his account had been suspended, took six days to review his complaint, did not notify him when his case was finally resolved, and did not apologise.

Databank

Operator: Brandon Jackson
Developer: Amazon
Country: USA
Sector: Consumer goods
Purpose: Provide information, services
Technology: NLP/text analysis; Natural language understanding (NLU); Speech recognition
Issue: Accuracy/reliability
Transparency: Governance

Page info
Type: Incident
Published: September 2023
Last updated: November 2023