Austria AMS employment service job seeker predictions

The so-called 'AMS algorithm' was developed in 2018 by Synthesis Forschung and Austria's Public Employment Service (Arbeitsmarktservice, or 'AMS') to predict job seekers' employment prospects and to allocate appropriate forms of support.

The system works by automatically classifying job seekers, calculating an individual 'IC' (integration chance) score from their gender, age, citizenship, education, health, care obligations and work experience, amongst other factors, to determine their relative employability.

It then assigns them to one of three prospective employability groups - A (High), B (Medium), or C (Low) - though scores can be challenged and overridden by human case workers.
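Public reporting describes the system as a logistic-regression-style model that combines weighted personal characteristics into a probability-like score, which is then cut into the three groups. The Python sketch below illustrates that mechanism only: the feature names and weights are hypothetical placeholders, and the roughly 66% and 25% cut-offs are figures cited in press coverage, treated here as assumptions rather than the AMS's actual parameters.

```python
import math

# Hypothetical weights: only fragments of the real model have been
# disclosed, so every value below is an illustrative placeholder.
WEIGHTS = {
    "intercept": 0.6,
    "female": -0.10,
    "over_50": -0.30,
    "non_eu_citizen": -0.20,
    "health_impairment": -0.50,
    "care_obligations": -0.15,   # reportedly counted for women only
    "higher_education": 0.40,
    "recent_work_experience": 0.50,
}

def ic_score(features: dict) -> float:
    """Combine 0/1 features into a probability-like 'integration chance'."""
    z = WEIGHTS["intercept"] + sum(
        WEIGHTS.get(name, 0.0) * value for name, value in features.items()
    )
    return 1.0 / (1.0 + math.exp(-z))  # logistic link

def assign_group(short_term_ic: float, long_term_ic: float) -> str:
    """Three-way segmentation using the cut-offs cited in reporting."""
    if short_term_ic >= 0.66:
        return "A"  # high prospects
    if long_term_ic < 0.25:
        return "C"  # low prospects
    return "B"      # medium prospects
```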

System 🤖

System databank 🔢

Operator: Public Employment Service (AMS)
Developer: Synthesis Forschung; Public Employment Service (AMS)
Country: Austria

Sector: Govt - employment

Purpose: Assess employability

Technology: Prediction algorithm
Issue: Bias/discrimination - gender, disability, age, location

Transparency: Black box

Risks and harms 🛑

Discrimination, structural prejudices

Academics and civil rights groups found that the algorithm gives lower scores to women over 30, women with childcare obligations, migrants, and people with disabilities, placing them in lower categories even when they have the same qualifications as men or non-disabled people. By contrast, men with children are not negatively weighted.
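To see how such weighting produces this disparity, the snippet below continues the hypothetical numbers from the sketch in the System section: two applicants with identical qualifications, where only the assumed 'female' and 'care obligations' penalties differ.

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

z_same_cv = 1.5  # identical education and work history (hypothetical value)

score_man = sigmoid(z_same_cv)                  # ~0.82
score_woman = sigmoid(z_same_cv - 0.10 - 0.15)  # ~0.78: lower score for
# the same CV once the assumed gender and childcare penalties apply
```

Near a group boundary, a gap of this size can be enough to move an otherwise identical applicant from one employability group into a lower one.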

The algorithm is also seen to discriminate against people living in parts of the country where unemployment rates tend to be high, thereby reinforcing structural prejudices and stereotypes. In addition, the system ignores important factors when assessing someone's employability - for example, it fails to capture or analyse soft skills and motivation in any quantifiable way.

In response, the AMS contended that the results allow it to better understand the population and its abilities so that it can better target its support. ‘Building an accurate picture of what is our reality cannot in itself be called discriminatory,’ it argued.

Transparency 🙈

The AMS has promised to make its algorithmic system transparent and accountable. 

But researchers have shown that its definitions, data collection and management practices, and model documentation are variously missing, lacking in detail, or so technical as to be incomprehensible. As AlgorithmWatch pointed out, the AMS has released only two of the 96 statistical models it claims to use to assess job seekers.

Furthermore, job seekers are given very little information about how the system works, and find it very difficult to challenge their classifications and scores.

Research, advocacy 🧮