In California, a man has been charged with two counts of vehicular manslaughter after his car left the freeway at a high rate of speed, ran a red light, and struck another vehicle, killing two people. After the accident, the National Highway Traffic Safety Administration (NHTSA) confirmed that Autopilot was in use in the Tesla at the time of the crash. The NHTSA has categorized these types of crashes as "automation complacency." This raises two questions: why are drivers being complacent, and who is to blame?
First, as the driver of a motor vehicle, you are, as a general rule, responsible for controlling your vehicle and for your actions and inactions. Of course, in life and in the law, there are exceptions. For example, if a wheel comes off your vehicle and causes an accident, the company that serviced your tires may ultimately be the responsible party. Likewise, if your vehicle has a defect that causes an accident, the manufacturer of the vehicle may be responsible. That last example will likely be the focus of the California driver's criminal case.
A vehicle, or any product for that matter, can be defective for failing to warn of a danger. The California driver will likely defend the criminal charges by claiming he was using the Autopilot technology and thought it would prevent any accidents. He will likely focus on the fact that the technology is called Autopilot, and that the more advanced and more expensive technology is called Full Self-Driving Capability. There will likely be lots of exhibits related to Tesla's marketing of the technology, such as its claim that Autopilot can "reduce your overall workload as a driver." In response, prosecutors will point to Tesla's admonitions to drivers that it is their "responsibility to stay alert, drive safely and be in control of your car at all times" and that the vehicle requires "a fully attentive driver who has their hands on the wheel and is prepared to take over at any moment."
And therein lies the rub: how does this technology "reduce your overall workload as a driver" if you must stay fully attentive with your hands on the wheel, prepared to take over at any moment? Are the names "Autopilot" and "Full Self-Driving" misleading to drivers and contributing to automation complacency? Is this the classic case of talking out of both sides of your mouth?
Our advice is this:

(1) If you are the owner of a vehicle with automated features of any type, use them wisely. If you are involved in an accident that injures or kills someone while using that technology, you may be held criminally responsible, and you will almost certainly still be financially responsible for the harm you cause. You may, in turn, have a product liability claim against the vehicle manufacturer (to try to hold the manufacturer responsible instead of you), but those cases are extremely difficult and expensive to pursue, and we are not aware of any that have been successful.

(2) On U.S. roadways, there are approximately 765,000 Tesla vehicles equipped with Autopilot and/or Full Self-Driving (FSD) capability. And, of course, other manufacturers offer automated systems as well. If you are involved in a serious crash with one of these vehicles and the other driver does not have sufficient insurance to fully compensate you, you will want to have underinsured/uninsured motorist coverage. We write about this type of insurance coverage all the time, but every week we continue to see people who did not know about it and who wish they had it. To learn more, use the search bar on this website and read the many articles about it.
As always, we are here if you need us. We offer a free, no-obligation consultation and our award-winning attorneys handle all accident cases on a contingency basis, so we only get paid if we recover money for you. We have three convenient Middle Tennessee locations, but handle cases anywhere in the State of Tennessee.