Deliveroo Italy algorithm ruled to discriminate against "reliable" riders
Occurred: January 2021
Page published: January 2021 | Last updated: March 2026
An Italian court ruled that Deliveroo’s "Frank" algorithm was discriminatory because it penalized riders for missing shifts regardless of the reason, thereby unfairly restricting future work opportunities for vulnerable groups and hindering fundamental employment rights.
The Labour Court of Bologna ruled that Deliveroo Italy used a discriminatory shift-booking system built around an algorithm called Frank, which assigned "reliability" and "participation" scores to riders.
Riders who failed to cancel a shift at least 24 hours in advance or failed to log in within 15 minutes of a shift start saw their scores drop.
Because riders with higher scores were given first pick of the most profitable shifts, those who missed work, even for legally protected reasons like emergencies, illness, or participating in a union strike, were systematically pushed to the back of the queue, leading to a significant loss of future income.
Deliveroo, which says it no longer uses the algorithm, was ordered to pay EUR 50,000 in damages to the unions that brought the case.
The company had previously claimed the algorithm cut delivery times by 20 percent.
The root cause was the algorithm's contextual blindness. Deliveroo designed the system to prioritise efficiency and "reliability" above all else, intentionally choosing not to distinguish between a rider who skipped work for a "trivial" reason and one who was legally entitled to be absent.
The court noted that Deliveroo could have programmed the algorithm to recognise legitimate absences (as it already did for insurance-covered accidents), but chose not to.
By hiding behind the "neutrality" of math, the company attempted to bypass traditional labour protections, treating human workers like interchangeable components in a machine-led logistics system.
For workers, the ruling establishes a legal precedent that gig workers have a right to "non-discriminatory" algorithms. It protects their right to strike and take sick leave without fear of being "shadow-banned" or de-prioritised by software.
For society and policymakers, the ruling challenges the idea that algorithms are objective or neutral. It signals to tech companies that they are legally liable for the "unintended" discriminatory outcomes of their code and encourages regulators to demand greater transparency into "black-box" management systems.
Frank
Developer: Deliveroo
Country: Italy
Sector: Transport/logistics
Purpose: Determine rider reliability
Technology: Workforce management system
Issue: Accountability; Employment/labour; Fairness; Transparency
AIAAIC Repository ID: AIAAIC0504