Tesla Autopilot, Full Self-Driving
Page published: June 2023 | Last updated: April 2026
Autopilot is an 'Advanced Driver Assistance System' (ADAS) designed, developed and managed by Tesla. It is classified as an SAE Level 2 system, in which the vehicle can control steering, acceleration and braking but the driver must monitor the road at all times and be prepared to take control at a moment's notice.
Launched in 2014, Autopilot was initially limited to automatic parking and the ability to summon a car on private property. Automatic emergency braking, adaptive cruise control and lane centering capabilities were introduced in 2016.
Tesla's Full Self-Driving (FSD) capability is an enhanced version of Autopilot that offers users a suite of more advanced features, including adaptive cruise control, automatic steering, automatic lane changing, auto park, traffic light recognition, stop sign recognition, and the ability to summon a car from a parking spot or garage.
A beta version of FSD was offered to a small group of users in the US in October 2020, expanding in May 2021 to a few thousand employees and customers and in October 2021 to Tesla drivers scoring full marks on a proprietary safety scoring system.
Website: Autopilot, Full Self-Driving 🔗
Developer: Tesla
Purpose: Automate steering, acceleration, braking
Type: Driver assistance system; Self-driving system
Technique: Computer vision; Machine learning; Neural network
Tesla. Customer privacy notice
Tesla. Legal resources
Tesla. Terms of use
Tesla's Autopilot and Full Self-Driving systems exhibit multiple transparency and accountability limitations:
Algorithmic black box. There is limited public information on how Autopilot and FSD function, leaving users uncertain about the underlying mechanisms and unable to understand how the systems make decisions.
Privacy. While Tesla claims to protect privacy, the extent of data collection and usage, particularly regarding camera recordings and vehicle data, remains unclear.
Public release of crash data. Tesla actively seeks to block or limit the release of detailed crash data related to its Autopilot system, including information on hardware/software versions, road conditions, and driver engagement at the time. This restricts regulators, researchers and the public from assessing real-world system performance and failure patterns.
Liability. Recent court verdicts where Tesla was found partially liable for crashes involving Autopilot highlight accountability issues. These cases often rely on data exposed only through legal or unauthorised channels, not voluntarily shared by Tesla.
Third-party audits. Tesla actively resists third-party audits or independent validation of Autopilot's performance and safety standards.
Regulatory oversight. Tesla has significant autonomy in determining software updates, potentially prioritising customer satisfaction over safety restrictions. The National Highway Traffic Safety Administration (NHTSA) has had to request detailed information about software updates and their impacts, indicating a lack of proactive disclosure.
Marketing. Tesla promotes Autopilot and FSD features as advanced, often leading to driver and public misconceptions about their true capabilities and autonomy levels. This contrast between rhetoric and technical reality hampers informed user understanding, leading to crashes and other incidents, and creates accountability gaps.
Quora 🔗
Reddit 🔗
TeslaMotors subreddit (unofficial) 🔗
RealTesla subreddit (unofficial) 🔗
Tesla's Autopilot and "Full Self-Driving" (FSD) systems have been linked to hundreds of crashes, many fatalities, and a host of other known harms, including:
Loss of life
Bodily injury
Over-reliance
Property damage
Privacy loss
Distress/anxiety
Psychological trauma
Operational disruption
Increased total road traffic
Depletion of rare mineral resources
Electronic waste
Excessive energy consumption
Excessive water consumption
These harms have led to multiple vehicle recalls; U.S. federal "Engineering Analysis" probes; U.S. federal criminal investigations into "self-driving" claims; and multiple wrongful death lawsuits.
Tesla Autopilot and FSD have been verified as partly or fully responsible for 59 deaths (source: Tesla Deaths), including:
October 2025. Florida. Tesla stops and is hit by semi-truck, killing Tesla driver
August 2025. Houston, Texas. Tesla Cybertruck with FSD activated attempts to drive off overpass
April 2025. Ventura, California. Motorcyclist killed after colliding with Tesla on Autopilot
April 2024. Snohomish County, Washington. Tesla with FSD activated crashes into rear of motorcycle, kills rider
November 2023. Flagstaff, Arizona. Tesla with FSD activated hits and kills pedestrian
September 2023. Idaho. Tesla on Autopilot collides with semi-truck, killing four occupants and a dog
July 2023. South Lake Tahoe, California. Tesla on Autopilot collides with Subaru Impreza, kills two
July 2023. Warrenton, Virginia. Tesla on Autopilot crashes into tractor-trailer, killing driver
July 2023. Opal, Virginia. Tesla on Autopilot crashes under tractor-trailer, killing driver
June 2023. Brooklyn, New York. Tesla on Autopilot kills pedestrian waiting on sidewalk
June 2023. Central Point, Oregon. Tesla on Autopilot drifts off road and hits tree, killing driver
March 2023. Corona, California. Tesla on Autopilot collides with Ford Pick-up truck, killing driver
February 2023. Walnut Creek, California. Tesla on Autopilot crashes into parked fire truck, killing driver
August 2022. Boca Raton, Florida. Tesla on Autopilot rear-ends Kawasaki motorcycle, kills rider
July 2022. Riverside, California. Tesla on Autopilot rear-ends Yamaha motorcycle, kills rider
July 2022. Draper, Utah. Tesla on Autopilot rear-ends Harley-Davidson, kills rider
July 2022. Gainesville, Florida. Two killed as Tesla on Autopilot crashes into parked tractor-trailer
May 2022. Mission Viejo, California. Tesla on Autopilot strikes and kills pedestrian in highway construction zone
May 2022. Evergreen, Colorado. Drunk driver using Tesla FSD killed after car hits tree
July 2021. New York. Tesla on Autopilot kills New York man changing tyre on expressway
May 2021. Fontana, California. Tesla on Autopilot strikes overturned truck, kills driver
September 2020. Marietta, Georgia. Tesla on Autopilot kills pedestrian in bus shelter
May 2020. Arendal, Norway. Tesla on Autopilot kills truck driver standing on road
December 2019. Gardena, California. Tesla on Autopilot runs red light, kills two
December 2019. Cloverdale, Indiana. Tesla on Autopilot rear-ends fire truck, driver and passenger killed
August 2019. Fremont, California. Tesla on Autopilot rear-ends Ford pick-up, kills passenger
April 2019. Key Largo, Florida. Tesla hid data about Autopilot crash that killed Florida couple, fined USD 423 million
March 2019. Delray Beach, Florida. Tesla on Autopilot crashes into 18-wheeler truck, kills owner
April 2018. Kanagawa, Japan. Tesla on Autopilot kills pedestrian outside Tokyo
March 2018. Mountain View, California. Tesla on Autopilot veers off highway into concrete barrier, kills driver
May 2016. Williston, Florida. Tesla on Autopilot collides with tractor-trailer, kills driver
January 2016. Hebei, China. Tesla on Autopilot crashes into road-sweeper, kills driver
Saint Amour v Tesla. A case involving a Cybertruck owner in Texas who claims her vehicle, while on FSD, attempted to drive off a Houston overpass. The lawsuit specifically targets Elon Musk's personal influence on engineering, alleging he ignored warnings from his own engineers to include LiDAR technology. It characterises Musk as an "aggressive and irresponsible salesman" whose design choices prioritise cost-cutting over safety.
Benavides v Tesla. A landmark verdict was reached regarding a 2019 fatal Autopilot crash. A jury found Tesla 33% liable for the accident, awarding USD 243 million in damages. The jury agreed with the plaintiff's argument that the public had been made part of a "beta test they never signed up for."
California DMV v Tesla. An administrative judge ruled that the names "Autopilot" and "Full Self-Driving" were "unambiguously false and counterfactual."
Huang v Tesla. Tesla settled the case days before it was set to go to trial. By settling, the company avoided a public discovery process that might have revealed internal emails regarding known software flaws. Critics and rival lawyers frequently cite this settlement as proof that Tesla is "scared" of a jury seeing its internal data.
AIAAIC Repository ID: AIAAIC1038