Amazon AI recruitment tool favours men over women

Occurred: October 2018


A secret Amazon recruitment tool intended to automate the hiring process for senior roles was found to favour men over women for technical jobs.

Built in 2014, the system used AI to give job candidates scores ranging from one to five stars, company insiders told Reuters. But it quickly became clear that the system did not rate candidates in a gender-neutral way: because it was trained on resumes submitted to the company over a 10-year period, most of which came from men, it favoured candidates describing themselves using verbs more commonly found on male engineers' resumes, such as 'executed' and 'captured'.

In addition, problems with the data that underpinned the models' judgments meant that unqualified candidates were often recommended for a variety of jobs. Amazon attempted to mitigate the bias, but scrapped the system in 2017 after concluding it was unsalvageable.

The incident was seen to demonstrate the limitations of machine learning in an industry long dominated by men. Amazon's use of the system without informing job applicants was also reckoned to reflect poorly on the company.

Databank

Operator: Amazon
Developer: Amazon
Country: USA
Sector: Business/professional services
Purpose: Process job applications
Technology: Machine learning
Issue: Bias/discrimination
Transparency: Governance; Marketing