Workday accused of building discriminatory AI job screening system
Occurred: 2020-
Page published: February 2023 | Page last updated: August 2023
Business and IT services company Workday stands accused in a class-action lawsuit of building an AI-powered job screening tool that discriminates against older, Black, and disabled applicants.
US job seeker Derek Mobley filed a federal lawsuit in California claiming that Workday’s algorithmic screening tools, which are used by thousands of major companies to recruit staff, disproportionately rejected his applications.
Despite having relevant qualifications and applying for over 100 roles, Mobley (who is Black, over 40, and has anxiety/depression) received near-instant rejections, often in the middle of the night.
A federal judge later granted preliminary certification for the case to proceed as a nationwide collective action.
The potential impact is vast: the court noted the system could have affected hundreds of millions of applicants aged 40 and older who were denied "employment recommendations" by Workday’s AI since 2020.
The harm centres on "disparate impact," where a facially neutral automated system creates a discriminatory barrier that prevents qualified candidates from even getting their "foot in the door."
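The complaint does not specify how disparate impact would be measured, but the standard heuristic in US employment law is the EEOC "four-fifths rule": a selection rate for a protected group below 80% of the highest group's rate is treated as evidence of adverse impact. A minimal sketch, using invented numbers:

```python
# Hypothetical illustration of the EEOC "four-fifths rule", the common
# heuristic for flagging disparate impact. All figures are invented and
# do not come from the Mobley v. Workday case.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who pass the screen."""
    return selected / applicants

def shows_disparate_impact(rate_group: float, rate_reference: float,
                           threshold: float = 0.8) -> bool:
    """Flag adverse impact if a group's selection rate falls below 80%
    of the reference group's rate (the four-fifths rule)."""
    return rate_group / rate_reference < threshold

# Invented example: 50 of 200 reference-group applicants pass the
# automated screen, but only 15 of 200 from the protected group.
ref_rate = selection_rate(50, 200)   # 0.25
grp_rate = selection_rate(15, 200)   # 0.075
print(shows_disparate_impact(grp_rate, ref_rate))  # True (0.075 / 0.25 = 0.3 < 0.8)
```

The rule is a screening heuristic, not a legal conclusion; courts also weigh statistical significance and business justification.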
The incident stems from the way AI models are trained and integrated into the hiring funnel.
The lawsuit alleges the AI was trained on data from existing, homogenous workforces, causing the algorithm to favor candidates who look like current employees and penalize those with different career patterns or backgrounds.
The system may use "proxies" for protected traits; for example, graduation dates can serve as a proxy for age, and certain personality assessments can inadvertently screen out neurodivergent or disabled candidates.
Many rejections occurred automatically without any human recruiter reviewing the applicant’s skills, turning the AI into a "black box" gatekeeper.
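The proxy mechanism described above can be sketched in a few lines. This is a hypothetical toy screener, not Workday's actual system: the rule never reads an applicant's age, yet because graduation year correlates almost perfectly with age, it screens out older candidates anyway.

```python
# Hypothetical sketch of how a facially neutral feature can act as a
# proxy for a protected trait. The screener below is invented for
# illustration; it is not based on Workday's software. It only looks at
# graduation year, but graduation year tracks age closely.

from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    grad_year: int
    age: int

def screen(applicant: Applicant, min_grad_year: int = 2005) -> bool:
    # Facially neutral rule: only "recent" graduates pass.
    return applicant.grad_year >= min_grad_year

# Invented applicant pool.
applicants = [
    Applicant("A", grad_year=1998, age=50),
    Applicant("B", grad_year=2003, age=45),
    Applicant("C", grad_year=2015, age=31),
    Applicant("D", grad_year=2020, age=26),
]

passed = [a for a in applicants if screen(a)]
# Every applicant over 40 is rejected, though age was never an input.
print([a.name for a in passed])  # ['C', 'D']
```

Because no human reviews the rejections, the correlation goes unnoticed: the system's inputs look neutral even while its outputs sort candidates by age.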
Workday initially argued it was merely a software vendor, not an employer. However, the court ruled that because the AI "participates in the decision-making process" by ranking and rejecting candidates, the company could be held liable as an "agent" of the employers.
For job seekers: The case exposes how "invisible" AI bias can lead to systemic exclusion. Unlike traditional discrimination, which might be limited to a single biased manager, a single flawed algorithm can cause harm across thousands of companies simultaneously.
For society: It challenges the assumption that automation is more "objective" than humans. It reinforces the need for "human-in-the-loop" systems where technology assists rather than replaces judgment in high-stakes life areas like employment.
For policymakers: The case signals to regulators and legislators that AI vendors - not just the companies buying the software - may be held legally responsible for discriminatory outcomes. It accelerates the push for mandatory bias audits, transparency disclosures, and stricter enforcement of civil rights laws in the digital age.
Candidate Skills Match
Developer: Workday
Country: USA
Sector: Business/professional services
Purpose: Screen job applicants
Technology: Machine learning
Issue: Accountability; Fairness; Transparency
Mobley v. Workday, Inc.
AIAAIC Repository ID: AIAAIC0956