Waymo robotaxi fails to stop for school bus
Occurred: November 2025
Page published: November 2025
A Waymo autonomous vehicle illegally drove around a stopped school bus with flashing lights and an extended stop arm while children were disembarking in Atlanta, Georgia, prompting a preliminary safety investigation by the US National Highway Traffic Safety Administration (NHTSA) into Waymo's entire fleet of robotaxis.
A Waymo robotaxi operating without a human safety driver failed to stop for a stationary school bus with its red lights flashing and both its stop arm and crossing control arm deployed - all legally mandated signals for vehicles to stop.
The Waymo vehicle, approaching from a perpendicular side street (a driveway, according to Waymo), initially stopped briefly but then proceeded to manoeuvre around the front of the bus.
In doing so, the autonomous vehicle passed the extended stop arm and the extended crossing control arm, driving near students who were actively disembarking from the bus.
While no injuries were reported, the action created a direct and potentially catastrophic safety risk for the children and violated traffic laws specifically designed to protect them, leading to a Preliminary Evaluation by the NHTSA's Office of Defects Investigation covering approximately 2,000 Waymo vehicles.
The incident points to a critical "edge case" failure in Waymo's fifth-generation Automated Driving System (ADS), raising questions about corporate transparency, safety validation, and the limits of current AI perception.
Accuracy/reliability: Waymo stated that the school bus was partially blocking the driveway the robotaxi was exiting, and from the vehicle's perspective, the flashing lights and stop sign were not visible due to the angle and obstruction. This suggests a deficiency in the AI's ability to interpret complex or obstructed real-world scenarios or a failure to default to the safest action (a complete stop) when critical information is ambiguous.
Accountability: The failure to correctly recognise and obey a stopped school bus is a fundamental, non-negotiable safety requirement. Reports suggest Waymo had addressed similar issues in the past, leading to concerns that this incident may represent a regression in the software's performance or a new, uncovered edge case.
Transparency: The incident was reportedly brought to federal regulators' attention via a media report rather than a voluntary public disclosure by the company. This highlights a systemic issue: the public and regulators often rely on external reports to identify safety-critical defects in continuously updated autonomous systems. Furthermore, unlike human drivers, who face tickets or licence penalties, the question of who or what is responsible for the violation - the operator, the company, or the software - remains an ongoing legal and regulatory challenge. Waymo did state that it has already implemented software updates and plans further ones.
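The "default to the safest action" principle raised under Accuracy/reliability above can be illustrated with a minimal decision-rule sketch. This is purely hypothetical pseudologic, not Waymo's actual software: the `Detection` type, labels, and threshold are invented for illustration. The idea is that any sighting of a safety-critical signal, or any occluded or low-confidence detection, should trigger a full stop rather than a manoeuvre.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "school_bus_stop_arm" (hypothetical label set)
    confidence: float   # detector confidence in [0.0, 1.0]
    occluded: bool      # whether the object is partially hidden

# Illustrative values only; real systems would use far richer signals.
SAFETY_CRITICAL = {"school_bus_stop_arm", "school_bus_flashing_lights"}
CONFIDENCE_THRESHOLD = 0.9

def plan_action(detections: list[Detection]) -> str:
    """Return 'STOP' when any safety-critical signal is seen, or when any
    detection is occluded or ambiguous; 'PROCEED' only when clearly safe."""
    for d in detections:
        if d.label in SAFETY_CRITICAL:
            # A safety-critical signal mandates a full stop, regardless
            # of confidence.
            return "STOP"
        if d.occluded or d.confidence < CONFIDENCE_THRESHOLD:
            # Ambiguous or obstructed perception: fail safe rather than
            # drive around an object the system cannot fully classify.
            return "STOP"
    return "PROCEED"
```

Under this kind of policy, a partially obstructed bus would itself be grounds for stopping - the occlusion Waymo cited would count as a reason to halt, not to proceed.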
The incident is a significant marker for the autonomous vehicle (AV) industry, shifting the focus from minor glitches to life-critical safety protocols.
For the general public: The most immediate impact is the potential for physical harm to children, and an erosion of public trust in autonomous technology.
For Waymo: The investigation exposes the company to potential regulatory action and increases scrutiny of the safety validation processes of all AV companies operating on public roads. It is Waymo's third major NHTSA safety investigation in recent years, signalling a pattern of issues that must be addressed before mass adoption.
For policymakers/regulators: The incident underscores the current gap in AV regulation and governance. State traffic laws, like Georgia's "Addy's Law" mentioned in media reports, are designed for human drivers, making enforcement mechanisms, such as issuing a ticket or assigning fault, difficult when the driver is an AI. It reinforces the perception that public streets are being used as live testing grounds where safety validation is occurring through response to near-miss incidents rather than exhaustive pre-deployment verification. It poses a fundamental challenge to the promised safety benefits of AVs if they cannot reliably handle basic, life-saving traffic laws.
Waymo Driver
Developer: Waymo
Country: USA
Sector: Automotive
Purpose: Automate steering, acceleration, braking
Technology: Self-driving system
Issue: Safety
AIAAIC Repository ID: AIAAIC2138