AI system directs Amazon delivery van onto "dangerous" mudflats
Occurred: February 2026
Page published: February 2026
An Amazon delivery driver using a proprietary AI-powered routing system drove onto a notoriously dangerous 600-year-old tidal path in Essex, UK, stranding her vehicle and prompting a hazardous recovery operation.
The driver, working for an Amazon delivery service, was attempting to reach Foulness Island when an AI-powered, GPS-based routing system directed her onto the Broomway, an ancient, unpaved tidal track across the Thames Estuary that is not intended for vehicles and quickly becomes hazardous as the tide changes.
The van subsequently became bogged down in deep mud and water on the flats, with the tide beginning to rise around it. The driver exited safely, but the vehicle was left stranded until a local farmer later recovered it.
The path is known locally to be extremely dangerous and has historically caused numerous drownings among people unfamiliar with it.
The incident can be attributed to a failure of algorithmic oversight and corporate accountability.
Amazon’s proprietary Flex navigation system failed to account for the physical reality of the Broomway, a tidal mudflat crossing rather than a road, and did not integrate real-time tidal data or other environmental hazard information.
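The missing check can be made concrete. The sketch below is not Amazon's implementation; the segment attributes, tide lookup, and safety margin are all assumptions. It simply illustrates how a router could veto a segment that is not rated for vehicles, or that is tidal and would be crossed too close to high water.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class RouteSegment:
    name: str
    is_tidal: bool          # hypothetical hazard flag in the map data
    vehicle_rated: bool     # True only for surfaced, drivable roads

def next_high_tide(when: datetime) -> datetime:
    """Placeholder for a real tide-table lookup (e.g. a national
    hydrographic service API). Returns the next predicted high water."""
    return when + timedelta(hours=3)  # stub value for illustration

def segment_is_safe(seg: RouteSegment, depart: datetime,
                    margin: timedelta = timedelta(hours=2)) -> bool:
    """Veto any segment that is not rated for vehicles, or that is
    tidal and would be crossed too close to high water."""
    if not seg.vehicle_rated:
        return False
    if seg.is_tidal and next_high_tide(depart) - depart < margin:
        return False
    return True

broomway = RouteSegment("The Broomway", is_tidal=True, vehicle_rated=False)
print(segment_is_safe(broomway, datetime.now()))  # False: never routable
```

Either condition alone would have excluded the Broomway, which fails the vehicle-rating test regardless of the state of the tide.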
Reports from Amazon drivers suggest a culture of "following the app at all costs." In this case, the driver likely prioritised meeting delivery quotas over personal judgement, or lacked the local knowledge to override the digital instruction.
Reports indicate that when the driver reached the sea wall and questioned the route, her human supervisor, who was also relying on the same digital interface, simply asked, "Does the satnav tell you to go that way?"
When she said yes, the supervisor told her to proceed: a classic example of automation bias, where humans defer to the AI's "judgement" as the ultimate source of truth, even in the face of obvious physical risk.
Amazon’s reliance on independent Delivery Service Providers (DSPs) creates a "responsibility gap." By classifying drivers as contractors, the company can distance itself from the logistical failures of the algorithms that actually dictate the drivers' movements.
For society: The incident serves as a stark warning about "automation bias", where users trust digital systems even when those systems contradict obvious physical warnings (in this case, directing a van towards the sea).
For industry: It underlines the necessity for "Human-in-the-Loop" safeguards, ensuring drivers have the authority, and the safety from retaliation, to reject dangerous automated routes.
For policymakers: It highlights the need for stricter mapping standards for logistics giants. Regulators may need to mandate that commercial navigation systems exclude "pedestrian-only" or "hazardous seasonal" routes from motorised vehicle profiles, along the lines of the sketch below.
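One form such a mandate could take is a tag-based exclusion rule in the routing profile. The sketch below uses OpenStreetMap-style way tags (highway, motor_vehicle, tidal), which are real OSM conventions, but the rule set itself is an illustrative assumption, not a description of any existing regulation or of Amazon's system.

```python
# Tag-based exclusion rule for a motorised routing profile, using
# OpenStreetMap-style way tags. The specific policy is an assumption.

FORBIDDEN_HIGHWAY_VALUES = {"footway", "path", "bridleway", "steps"}

def routable_by_van(tags: dict[str, str]) -> bool:
    """Return False for ways a commercial van profile must never use."""
    if tags.get("highway") in FORBIDDEN_HIGHWAY_VALUES:
        return False
    if tags.get("motor_vehicle") == "no":
        return False
    # Seasonal/tidal hazards: OSM marks some ways covered at high water
    if tags.get("tidal") == "yes":
        return False
    return True

# A Broomway-like case: an unpaved track flagged as tidal
print(routable_by_van({"highway": "track", "tidal": "yes"}))  # False
```

The point of such a rule is that the exclusion lives in the routing profile itself, so no downstream component (driver app, supervisor dashboard) can ever be offered the hazardous way as an option.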
Amazon Flex
Developer: Amazon
Country: UK
Sector: Transport/logistics
Purpose: Calculate route efficiency
Technology: Routing algorithm
Issue: Accuracy/reliability; Accountability; Automation bias
AIAAIC Repository ID: AIAAIC2221