by Wilton H. Strickland
A Miami jury recently found that Tesla shares liability, to the tune of $243 million, for a collision that injured one pedestrian and killed another. The collision occurred when the driver, while using his Tesla’s autopilot, failed to stay alert, apparently believing that the autopilot is 100% autonomous. This was despite Tesla’s instructions and warnings, which specifically state that a driver must stay alert while using the autopilot because it is not 100% autonomous. Apparently the word “autopilot” is so confusing that it erased the import of those instructions and warnings, par for the course with a Florida jury.
I wish this verdict had been delivered just two months earlier, for it would have helped me prepare the opposition to a driver’s motion for summary judgment in a similar case. The driver argued that he had transformed into a mere passenger upon activating his Tesla’s autopilot, meaning that he could not be held liable at all for drifting out of his lane and colliding with another vehicle. One of my opposition arguments was that the driver, if he honestly believed his theory, should have filed a third-party claim against Tesla, a very deep pocket (he had not done so). The Florida verdict would have helped me show not only the feasibility of this approach, but also that a driver still bears at least some responsibility for failing to stay alert while using autopilot.
Lacking the benefit of the Florida verdict, I still managed to make arguments that could prove useful to other people injured by drivers who are attempting to deactivate their liability with the flip of a switch. As technology continues improving and proliferating across all aspects of modern life, the temptation to blame it for one’s own negligence will surely grow, so here are some possible grounds to prevent that from happening in the autopilot context:
- First, it is important to scrutinize the driver’s conduct before activating the autopilot. In my case, the driver had been consuming alcohol just before entering his vehicle; the weather conditions were dark, wet, and snowy; the driver recalled instances when the autopilot had allowed the vehicle to drift when the road lines were not clear; and he had not read the vehicle’s safety guidelines. Under these circumstances, he bore at least some responsibility and could not blame autopilot for everything that happened after activating it.
- Second, it is important to scrutinize the driver’s conduct while the autopilot is activated. In my case, the driver by his own admission tried to stay alert and kept his hands near the steering wheel, grabbing it shortly before the collision in an effort to steer to safety. Under these circumstances, he could not credibly claim to have believed that the autopilot was 100% autonomous.
- Third, it is important to scrutinize the driver’s conduct after the incident. Did the driver express remorse or otherwise demonstrate an awareness of fault? If so, that defeats the notion that the autopilot is entirely to blame.
- Fourth, there is no case law (yet) holding that the driver of a Tesla or similar vehicle sheds all of his responsibilities and becomes a mere passenger upon activating the autopilot. Such a result would be unprecedented and dangerous for reasons of public policy, as it would encourage drivers to do less rather than more to protect people’s safety.
- Fifth, every state has statutes and case law establishing a driver’s duties that cannot be ignored merely because of a newfangled technology. In my case (from Montana), there is case law holding that “[a] driver has a duty to keep a look out, to keep the vehicle under control, to operate the vehicle at a reasonable speed, and to not drive if under the influence of alcohol.” Buxbaum v. Trustees of Ind. Univ., No. CDV-2000-31, 2002 Mont. Dist. LEXIS 2020, at **4 (July 17, 2002) (citing Buck v. State, 222 Mont. 423, 430, 723 P.2d 210, 214 (1986)). There is no exception for autopilot.
- Sixth, many states do not allow a defendant to apportion fault to an absent third party. If you are in such a jurisdiction and the driver is pointing a finger at Tesla, the driver should either implead Tesla or abandon this theory, as I argued in my case.
- Seventh, even in the states that allow a defendant to blame an absent third party in whole or in part for causing harm, causation is typically a factual question reserved for a jury, so this argument should not be allowed to support a driver’s motion for summary judgment.
- Eighth, even if a Tesla driver can switch off his ordinary duties by activating the autopilot, he likely still can be held responsible for negligently performing other duties that he voluntarily assumed while occupying the vehicle. In my case, again, the driver admittedly attempted to monitor the Tesla and to avoid the crash by grabbing the steering wheel. Such efforts created a new legal duty that bound him to act in a careful manner. See Lokey v. Breuner, 2010 MT 216, ¶ 10, 358 Mont. 8, 243 P.3d 384 (“This Court has recognized and adopted the ‘long-standing principle of tort law that “one who assumes to act, even though gratuitously, may thereby become subject to the duty of acting carefully, if he acts at all.”’”) (citations omitted).
In short, the temptation to avoid responsibility and to blame others for one’s own failings will surely continue to grow as technology does, making life more difficult for people who are seeking relief in court. The Florida verdict against Tesla will encourage this unhealthy trend, but the good news is that the driver was also held partially responsible and could not claim to be blameless when negligently operating a vehicle on the public roads.



