Occurred: April 2019
Researchers from Tencent's Keen Security Lab adversarially tricked a Tesla Model S into steering into a lane of oncoming traffic.
The exploit used small, inconspicuous stickers placed on the road surface to trick the car's Enhanced Autopilot into perceiving a shift in the current lane and steering to follow it, raising questions about the safety and efficacy of Tesla's driver assistance system.
As Technology Review's Karen Hao noted, 'Tesla’s Autopilot is vulnerable because it recognizes lanes using computer vision. In other words, the system relies on camera data, analyzed by a neural network, to tell the vehicle how to keep centered within its lane.'
Tesla responded that the adversarial attack was unrealistic 'given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so, and can manually operate the windshield wiper settings at all times.'
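The mechanism Hao describes, a neural network reading camera pixels to find lane markings, is what makes sticker attacks possible: small, targeted input changes can push a model's output across a decision boundary. As a minimal sketch (not Tesla's system; the classifier, weights, and inputs below are invented for illustration), here is a gradient-sign perturbation against a toy logistic "lane marking" detector:

```python
import numpy as np

# Toy illustration (NOT Tesla's Autopilot): a gradient-sign adversarial
# perturbation against a made-up logistic "lane marking" classifier.
rng = np.random.default_rng(0)
w = rng.normal(size=16)   # pretend-trained weights over 16 "pixels"
b = 0.1

def predict(x):
    """Probability that the input patch contains a lane marking."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

x = rng.normal(size=16)   # a benign input patch

# For a linear logit, the gradient w.r.t. the input is just w, so an
# FGSM-style step nudges each pixel by epsilon in the gradient's sign,
# pushing the score toward the opposite class.
epsilon = 0.5
push_up = predict(x) < 0.5
x_adv = x + (1 if push_up else -1) * epsilon * np.sign(w)

print(predict(x), predict(x_adv))
```

Each pixel moves by at most epsilon, yet the logit shifts by epsilon times the sum of the weight magnitudes, which is why a visually minor change can flip the model's decision. Real attacks like Keen Lab's work the same way in spirit, but against deep networks and physical-world inputs.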
Operator: Keen Security Lab
Purpose: Automate steering, acceleration, braking
Technology: Driver assistance system
Issue: Accuracy/reliability; Safety
Transparency: Black box
News, commentary, analysis
Published: March 2023