Amazon Alexa reinforces female stereotyping
Occurred: May 2019
The use of female voices on voice-powered virtual assistants, including Amazon's Alexa, Apple's Siri, and Google Assistant, portrays women as 'obliging, docile and eager-to-please helpers', reinforcing gender stereotypes, according to a UNESCO report.
The report found that responses from AI assistants such as Alexa to verbal sexual harassment tend to be 'deflecting, lacklustre or apologetic', and that the assistants appeared to show greater tolerance of sexual advances from men than from women.
The authors made a series of recommendations on how technology companies can make their virtual assistants less biased, notably by hiring more female programmers and by offering gender-neutral voices rather than making their assistants female by default.
Operator:
Developer: Amazon
Country: Global
Sector: Media/entertainment/sports/arts
Purpose: Provide information, services
Technology: NLP/text analysis; Natural language understanding (NLU); Speech recognition
Issue: Bias/discrimination - gender
Page info
Type: Incident
Published: September 2023