Tesla is facing one of its most consequential legal battles to date, as a federal jury in Florida considers whether the company’s Autopilot system was responsible for a fatal 2019 crash in Key Largo. The case marks the first federal jury trial involving a death allegedly caused by Tesla’s driver-assistance technology — and the plaintiffs are seeking $345 million in damages, including $109 million in compensatory damages and $236 million in punitive damages.
The trial, which began on July 14 in the Southern District of Florida, is seen as a key test of legal liability in the age of semi-autonomous vehicles, and could have far-reaching implications for Tesla and the broader self-driving car industry.
The case revolves around a crash that occurred in March 2019, when George McGee, the driver of a Tesla Model S, was using Tesla’s Enhanced Autopilot system while driving through Key Largo. McGee admitted during the trial that he dropped his phone while driving and leaned over to pick it up, assuming Autopilot would handle the road and brake for any obstacles.
Instead, the vehicle accelerated through an intersection at over 60 miles per hour, crashing into a parked car and fatally striking 22-year-old Naibel Benavides, who was standing beside the parked car with her boyfriend, Dillon Angulo. Benavides died at the scene; her body was found nearly 75 feet from the point of impact. Angulo survived but sustained multiple broken bones, a traumatic brain injury, and lasting emotional trauma.
The plaintiffs include Benavides’ family and Angulo himself, who testified in court. Angulo is seeking compensation for medical costs and emotional suffering, while the Benavides estate is pursuing a wrongful death claim and substantial punitive damages.
Plaintiffs’ attorneys argue that Tesla’s Autopilot system was defective and unsafe, and that both Elon Musk and the company’s marketing materials misrepresented it to consumers. They cited years of public statements in which Musk touted the safety and near-autonomous capabilities of Tesla vehicles.
Legal counsel for the plaintiffs also contended that Autopilot should only be used on limited-access highways, and that Tesla failed to restrict its use to roads where the system could operate safely and reliably.
The attorneys further argued that Tesla's branding and communications gave drivers a false sense of security, encouraging over-reliance on the technology.
Tesla has denied any wrongdoing and claims the crash was due to human error, not flaws in Autopilot.
In closing arguments, Tesla attorneys emphasized that the company clearly instructs customers on how to use Autopilot safely and warns that the system requires active driver supervision. They insisted that McGee’s distraction was the primary cause of the crash and that penalizing Tesla would discourage companies from developing life-saving technologies.
Tesla also pointed out that the driver, George McGee, had already faced legal consequences. He was charged with careless driving in October 2019 and did not contest the charge. The Benavides family had previously reached a private settlement with McGee.
Tesla typically handles Autopilot-related cases out of court or via private arbitration. However, Judge Beth Bloom allowed this case to proceed to trial, writing in an early July ruling that “a reasonable jury could find that Tesla acted in reckless disregard of human life for the sake of developing their product and maximizing profit.”
This makes the case especially significant. If the jury rules against Tesla, it could open the door for future lawsuits, raise regulatory scrutiny, and pressure automakers to rethink their marketing and safety testing for advanced driver-assistance systems.
Tesla's Autopilot has been the subject of multiple investigations by the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) following a series of crashes, including more than a dozen fatal accidents in the U.S. alone since 2016. As of mid-2025, the NHTSA has linked 17 deaths to potential misuse or failure of Tesla’s semi-autonomous features.
Critics argue that Tesla's use of terms like "Autopilot" and "Full Self-Driving" misleads consumers into thinking the cars are more autonomous than they actually are. Tesla maintains that these systems improve safety when used properly and that data from its fleet shows lower crash rates per mile compared to human-driven cars.
During closing arguments, members of the Benavides family and Angulo were present in the courtroom. They turned away as graphic videos and crash site photos were shown to the jury. The emotional weight of the testimony underscored the human cost at the center of this case.
The jury’s decision could mark a turning point in how courts — and eventually regulators — view corporate responsibility for AI-powered vehicle systems.
This trial doesn’t just put Tesla’s Autopilot on trial — it challenges the automotive industry's entire narrative around driver-assist technologies. A decision against Tesla could lead to increased oversight, transparency, and accountability in the race toward automation.
Whether Tesla wins or loses, the outcome is likely to echo far beyond one courtroom in Florida.