December 12th, 2022

The New Reality of Civil and/or Criminal Accident Liability While Using EV Autopilot


Author: Fred A. Balkin


Can You Be Held Responsible For An Accident Which Occurred When You Were Not Physically Driving?

For many years now, autopilot has been talked about and sold to the public as the future of automobile travel, for both private passenger vehicles and commercial vehicles. At face value, it is certainly intriguing to the average consumer and appeals to many of us who dream about the day we can simply get into the back seat of our car and instruct it to take us wherever we desire while we read a book or even take a nap.

Commercially, the thought of eliminating the need for truck drivers, for example, seems like a way to substantially increase profits for trucking and shipping companies. Likewise, for rideshare and taxi companies, removing the potential for human error from driving and replacing it with computer technology, which theoretically does not make mistakes or become distracted, is very attractive.

So naturally, when EV companies like Tesla started offering autopilot included with the purchase of a car, "enhanced autopilot" for $6,000, or better yet "full self-driving" capability for $15,000, all instantly obtainable through the mobile application even if not purchased when ordering the car, many Tesla owners jumped at the offers. This was especially true when they found out they could subscribe to "full self-driving capability" for $199.99 a month instead of paying the full price of the feature up front. Who would not want to be driven around by their car for an extra $200 a month, as opposed to hiring a driver for thousands of dollars a month?

To begin, it is important to note that an estimated 750,000-plus Tesla cars are equipped with some form of autopilot. Many Tesla owners who use this feature assume that autopilot means they do not have to sit in the driver's seat, pay attention, or be ready to take control of the car at any moment. In fact, some drivers have been spotted sitting in the back seat and letting the car drive them. On the contrary, Tesla states in its vehicle manuals that "it is the driver's responsibility to stay alert, drive safely and be in control of the vehicle at all times," and provides a list of 10 conditions, with a note that there may be more, that can hinder the full self-driving capability of the vehicle.

So, with the above noted in the vehicle manual, as well as in other portions of the vehicle programming, what happens when an accident clearly caused by the EV while it is in self-driving mode results in serious injury or death? The legal landscape has shifted significantly as these cases have moved through the courts. In the landmark Benavides v. Tesla case, stemming from a 2019 Florida Keys crash, a Miami jury awarded more than $243 million in damages, including $200 million in punitive damages, after finding Tesla's Autopilot system defective and partially responsible for the death of a pedestrian and injuries to another; a federal judge upheld that verdict in early 2026, marking the first major plaintiff victory in an Autopilot wrongful death suit. Separately, the driver in a 2019 Gardena, California incident, in which a Tesla on Autopilot ran a red light at high speed and killed two people, later pleaded no contest to vehicular manslaughter and received probation and restitution, confirming that criminal liability for drivers who over-rely on self-driving systems is a real and enforceable consequence. These two outcomes, one civil and one criminal, illustrate that both the manufacturer and the driver can face serious legal accountability when autopilot systems are involved in fatal crashes, and they are already shaping how future cases involving self-driving technology will be litigated across the country.