Amazon Mentor delivery driver scoring criticised as inaccurate, invasive
Occurred: February 2021-
Page published: February 2021 | Last updated: June 2024
Amazon’s reliance on the "Mentor" app to score delivery drivers drew intense criticism for creating a "punishment loop" in which inaccurate AI-driven penalties and invasive surveillance conflicted with impossible delivery quotas, ultimately compromising worker safety.
Developed by eDriving, the Mentor app promises to improve driver safety by generating a daily 'FICO' score that measures each driver's performance behind the wheel, and by providing drivers with micro-training modules.
Amazon mandated that its third-party Delivery Service Partner (DSP) drivers use Mentor beginning around 2017, a requirement that intensified during the COVID-19 pandemic.
However, Amazon drivers complained that the app was often inaccurate, reporting being docked points for phone use even when they did not answer a ringing phone while driving. One driver was flagged for distracted driving at every delivery stop she made.
Drivers also complained that the app led to unfair disciplinary action, including withheld payouts, lost bonuses and perks, and, in some cases, dismissal.
Mentor was also seen as unnecessarily invasive, tracking drivers' location after they clocked out from work, according to a report by CNBC.
The incident is rooted in a fundamental transparency and accountability gap within Amazon's "arm's length" employment model.
Conflicting incentives: While Amazon publicly championed Mentor as a safety tool, its internal routing algorithms set delivery quotas so high that drivers often had to speed or bypass safety protocols to finish their shifts.
Lack of recourse: The "black box" nature of the FICO algorithm meant drivers had no meaningful way to appeal false positives. Amazon effectively outsourced the "managerial" role to an algorithm, allowing it to maintain control over standards without the legal liability of direct employment.
Corporate shielding: By using third-party DSPs, Amazon could distance itself from the "gaming of the system" that occurred when managers - pressured by Amazon’s own quotas - told drivers to turn off the app mid-shift to hide the reckless driving required to meet targets.
For directly impacted drivers, the system transformed the workplace into a high-stress environment where human judgment was secondary to an unforgiving and often wrong digital score. This led to "algorithmic burnout" and physical risks, as drivers focused more on "pleasing the app" than on actual road conditions.
For society, the Amazon Mentor case serves as a warning about the "Uber-isation" of safety. It demonstrates how AI can be used to perform "automated management," where surveillance is rebranded as "coaching" to circumvent labour laws. It also highlights a public safety paradox: a system designed to make roads safer may actually make them more dangerous by incentivising drivers to prioritise an arbitrary digital metric over real-world situational awareness.
Developer: eDriving
Country: USA
Sector: Transport/logistics
Purpose: Assess delivery driver performance
Technology: Performance scoring algorithm
Issue: Accountability; Accuracy/reliability; Fairness; Privacy/surveillance; Transparency
https://www.cnbc.com/2021/02/12/amazon-mentor-app-tracks-and-disciplines-delivery-drivers.html
https://mashable.com/article/amazon-mentor-delivery-driver-monitoring-app
https://www.wired.com/story/some-amazon-drivers-have-had-enough-can-they-unionize/
https://www.reddit.com/r/AmazonDSPDrivers/comments/gydrct/mentor_has_many_flaws/
AIAAIC Repository ID: AIAAIC0754