Tesla Autopilot, Full Self-Driving
Autopilot is a so-called 'Advanced Driver Assistance System' (ADAS) designed, developed, and managed by Tesla. Launched in 2014, Autopilot was initially limited to automatic parking and the ability to summon a car on private property. In 2016, automatic emergency braking, adaptive cruise control, and lane centering capabilities were introduced.
Autopilot is classified as an SAE Level 2 system, in which the car can control steering, acceleration, and braking but requires the driver to monitor the driving at all times and be prepared to take control at a moment's notice.
Tesla's Full Self-Driving (FSD) capability is an enhanced version of Autopilot that was launched in 2016 and offers users a suite of more advanced features, including adaptive cruise control, automatic steering, automatic lane changing, auto park, traffic light recognition, stop sign recognition, and the ability to summon a car from a parking spot or garage.
A beta version of FSD was offered to a small group of users in the US in October 2020, expanding in May 2021 to a few thousand employees and customers and in October 2021 to Tesla drivers scoring full marks on a proprietary safety scoring system.
The safety of Autopilot and FSD and their impact on public safety have been regularly questioned by policy-makers, regulators, customers, and others.
In October 2022, the US Securities and Exchange Commission (SEC) opened a civil investigation into whether Tesla had been misleading investors about the safety of its Autopilot system, according to The Wall Street Journal.
A 2021 study of three Tesla Model 3 cars discovered they exhibited 'significant between and within vehicle variation on a number of metrics related to driver monitoring, alerting, and safe operation of the underlying autonomy... suggest[ing] that the performance of the underlying artificial intelligence and computer vision systems was extremely variable.'
Tesla owners have also complained about the 'phantom braking' of their vehicles, an issue traced by The Washington Post to Tesla's decision to stop using radar sensors in new vehicles in order to move to its Tesla Vision camera-based system, and to an update to its Full Self-Driving beta programme.
Autopilot and FSD are considered likely to have been involved in, or responsible for, multiple fatal accidents.
An April 2023 Reuters report alleged Tesla employees had been privately sharing sensitive images and videos recorded by its customers' car cameras on its internal messaging system and with third-party suppliers in order to improve Tesla's computer vision machine learning systems.
A 2022/23 investigation by the Netherlands' Autoriteit Persoonsgegevens (Dutch Data Protection Authority) ruled Tesla had violated the privacy of people coming close to its cars. Tesla's Sentry Mode uses four cameras continuously filming everything around a parked vehicle to protect it against theft and vandalism, with images saved for one hour in the car itself.
Tesla and its CEO Elon Musk have been regularly dogged by accusations of systematically overstating Autopilot and Full Self-Driving (FSD) capabilities and understating their role in accidents, allegations that have resulted in multiple complaints, investigations, and lawsuits against the company.
Country: USA; Global
Purpose: Automate steering, acceleration, braking
Technology: Driver assistance system
Issue: Governance; Accuracy/reliability; Privacy; Safety
Transparency: Governance; Black box; Marketing; Legal
Cummings M.L., Bauchwitz B. (2021). Safety Implications of Variability in Autonomous Driving Assist Alerting
Investigations, assessments, audits
Published: June 2023
Last updated: November 2023