Waymo sued after cyclist is doored by robotaxi passenger
Occurred: February 2025
Page published: November 2025
Waymo is being sued after a San Francisco cyclist said she was "doored" by a passenger exiting a driverless Waymo robotaxi that had pulled into a bike lane. The collision left her with serious brain and spine injuries and highlights gaps in how autonomous vehicles protect people on bikes.
Cyclist Jenifer Hanki was riding in a marked bike lane on 7th Street, San Francisco, when a driverless Waymo robotaxi stopped near the curb, reportedly in a "No Stopping" zone, to discharge passengers. According to the subsequent lawsuit, a passenger suddenly opened the rear curbside door (facing the bike lane), striking the cyclist in a "dooring" collision.
The impact ejected Hanki from her bicycle, throwing her into the adjacent traffic lane, where she then collided with a second autonomous Waymo vehicle that was allegedly also obstructing the bike lane. Hanki was hospitalized with serious bodily injuries, including brain, spine, and soft tissue damage, which reportedly left her unable to work or cycle.
The Waymo passengers reportedly left the scene after shrugging and expressing confusion, and the company allegedly declined to identify or disclose their identities, causing the victim additional distress and uncertainty in seeking redress.
Hanki later sued Waymo and its owner Alphabet for negligence, defective product liability, battery, and emotional distress.
The incident appears to stem from insufficient design and accountability mechanisms for managing passenger behaviour in driverless vehicles.
Unlike human drivers, an autonomous system cannot verbally warn passengers, perform shoulder checks, or enforce safe exit behaviour.
Any failure to implement door-opening sensors, exit-side alerts, or geofenced safe-dropoff logic increases the risk of dooring.
The gap between human-centric road norms and AV-centric system design remains a known risk factor.
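The safeguards described above can be sketched as a simple gating check. The following is a minimal illustrative sketch only; the names, data structure, and conditions are hypothetical and are not drawn from Waymo's actual software:

```python
# Hypothetical sketch of a "safe exit" gate for a robotaxi: curbside doors
# stay locked until the drop-off location is legal and the exit side is clear.
from dataclasses import dataclass

@dataclass
class DropoffContext:
    in_no_stopping_zone: bool  # from geofenced map data (assumed signal)
    blocks_bike_lane: bool     # vehicle footprint overlaps a marked bike lane
    curbside_clear: bool       # exit-side sensors detect no approaching cyclist

def may_unlock_curbside_doors(ctx: DropoffContext) -> bool:
    """Return True only when every exit-safety condition is met."""
    if ctx.in_no_stopping_zone:
        return False  # geofenced drop-off logic: reroute to a legal stop
    if ctx.blocks_bike_lane:
        return False  # never discharge passengers into a bike lane
    if not ctx.curbside_clear:
        return False  # exit-side alert: hold the doors while a cyclist passes
    return True

# The scenario alleged in the lawsuit would fail all three checks:
crash_ctx = DropoffContext(in_no_stopping_zone=True,
                           blocks_bike_lane=True,
                           curbside_clear=False)
print(may_unlock_curbside_doors(crash_ctx))  # → False
```

The point of the sketch is that each hazard named in the lawsuit (illegal stop, bike-lane obstruction, unchecked door opening) maps to a distinct, testable condition that system-level door interlocks could enforce.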
For the cyclist: The incident resulted in severe physical injury, emotional distress, lost income, and the daunting challenge of navigating a complex legal battle against a major technology corporation (Alphabet/Waymo) to seek compensation and accountability.
For road users and the general public: It underlines the immediate, real-world danger posed by AV operational errors and inadequate safety features, particularly concerning established urban hazards like 'dooring.' It demands that autonomous driving companies design for the worst-case human passenger behavior and strictly adhere to traffic laws, especially regarding dedicated lanes.
For the autonomous vehicle industry: This case is a crucial test of AV liability. It forces a legal determination on whether the AV operator is responsible for:
The vehicle's choice of an illegal/unsafe drop-off location.
The failure of its passenger-facing safety alert system.
The post-crash conduct of its unmanned service.
If Waymo is held liable, the case could set a powerful precedent for stricter operational standards, mandatory system-level safeguards (such as automatically locking doors when an obstacle is detected), and enhanced corporate accountability, shaping the future of urban AV deployment.
Developer: Waymo
Country: USA
Sector: Automotive
Purpose: Automate steering, acceleration, braking
Technology: Self-driving system
Issue: Accountability; Liability; Safety; Transparency
California Vehicle Code
AIAAIC Repository ID: AIAAIC2144