Tesla hides data about Autopilot crash that killed Florida couple
Occurred: April 2019
Page published: March 2023 | Page updated: September 2025
Tesla hid and denied possession of key electronic evidence from a fatal 2019 Autopilot crash in Florida, only admitting the evidence existed after forensic experts and a hacker uncovered it, leading to a landmark USD 243 million jury verdict against the company.
A 2019 Tesla Model S in Autopilot mode ran through a stop sign and a flashing red light at a T-intersection, striking a parked Chevrolet Tahoe, which spun and hit pedestrians Naibel Benavides Leon and Dillon Angulo, who were stargazing by the roadside in Key Largo.
Leon was killed and her boyfriend, Angulo, suffered serious injuries. Tesla driver George McGee later claimed that he had bent over to retrieve a phone he had dropped at the time of the crash, and that he thought Autopilot “would protect him and prevent a serious crash if he made a mistake”.
A complete collision snapshot, including sensor data and video, was automatically uploaded to Tesla's servers immediately after the crash. The vehicle then deleted its local copy, leaving Tesla as the only party with access to the data.
Tesla actively misdirected police and plaintiffs, systematically denying the data's existence for years, before hacker @greentheonly ultimately recovered the evidence, proving Tesla had always had access. The data showed the Autopilot system was engaged and failed to issue proper warnings or disengage despite a hazardous scenario.
Leon's family filed lawsuits against Tesla and against McGee; the latter suit was settled out of court. The lawsuit against Tesla alleged that the company marketed a vehicle with "defective and unsafe characteristics, such as the failure to adequately determine stationary objects in front of the vehicle, which resulted in the death of [the victim]".
The trial revealed damaging transparency failures and led to Tesla being held 33 percent responsible and ordered to pay hundreds of millions in damages.
Tesla's actions appeared driven by a desire to avoid blame and shield its Autopilot system from scrutiny.
The company withheld and denied possession of crash data despite knowing it was automatically uploaded to its servers, then only admitted to its existence after third-party intervention.
The incident showed significant limitations in transparency and accountability, with Tesla controlling all access to the most crucial evidence.
Regulatory agencies also showed reluctance to force disclosure, raising further concerns about oversight of autonomous vehicle technology.
The incident exposes serious safety risks in Tesla's autonomous vehicle technologies, and raises concerns about the carmaker's lack of transparency and its attempts to evade legal and reputational accountability.
The verdict is expected to embolden other victims to take their cases to trial, increase scrutiny of Tesla's Autopilot claims, and prompt demands for stronger safety and transparency standards for AI-driven cars more widely.
Tesla has asked for the verdict to be overturned, arguing legal and technical points concerning responsibility for and use of Autopilot.
Tesla Autopilot
Tesla Autopilot is an advanced driver-assistance system (ADAS) developed by Tesla that amounts to partial vehicle automation (Level 2 automation, as defined by SAE International).
Source: Wikipedia
Developer: Tesla
Country: USA
Sector: Automotive
Purpose: Automate steering, acceleration, braking
Technology: Driver assistance system; Computer vision
Issue: Accountability; Accuracy/reliability; Safety; Transparency
NHTSA investigation EA22002