Car Accidents Involving Autonomous Cars: Who Is Liable?

Foley & Lardner LLP

With automated cars come hopes of safer driving, more efficient commuting, increased productivity, reduced human error, and fewer accidents. However, as self-driving cars become a reality, car accidents may lead to legal controversy over who is responsible for the accident: the manufacturer or the owner?

In September 2016, the National Highway Traffic Safety Administration (NHTSA) released its Federal Automated Vehicles Policy, which adopted the Society of Automotive Engineers (“SAE”) classification levels for automation. The SAE developed a six-level scale to classify automation based on the required input of the human driver or occupant. The scale runs from level zero, denoting full human control, to level five, denoting a fully autonomous vehicle requiring no human input. Cruise control is considered level one, while “partial automation,” including parking assist, lane departure warning, and automatic braking, is considered level two. Although Tesla’s “Autopilot” feature begins to break the threshold between level two and level three, it continues to require human input and monitoring to operate. Highly automated vehicles fall between levels three and five.

Despite accidents involving autonomous vehicles, this steady shift from human input toward partial and fully autonomous operation creates a unique and complex legal question not only for consumers, but also for fellow motorists, manufacturers, and their suppliers.

In March 2017, there was an accident involving a Tesla Model X and a Phoenix police officer. According to USA Today, the incident barely warranted a police report: there were no injuries or damage, and the contact between the cars was merely a “tap.” The driver alleged that his Tesla was in Autopilot mode at the time. Similar incidents have been reported with Uber, Waymo/Google, and GM’s autonomous systems in recent months, but in the overwhelming majority of incidents the opposing driver was at fault. As level 1-3, and ultimately level 4-5, autonomous vehicles become more ubiquitous on the roadways, questions have been raised about who will be liable when a vehicle that is fully autonomous, or operating in a fully autonomous mode, is involved in an accident that was the fault of the vehicle rather than the passenger or human driver.

In May 2016, Tesla reported its first driver death while the Tesla Autopilot system was activated, when a Tesla hit a tractor-trailer that was crossing a highway perpendicular to the flow of traffic. An investigation by NHTSA concluded that the driver had at least seven seconds to respond and possibly mitigate or avoid the crash — longer than most drivers have in similar situations — but that distraction likely caused the driver to be non-responsive to the hazard. An examination of the Autopilot system installed at the time of the accident noted that it was designed to avoid rear-end collisions, but that the onboard radar was ineffective because it “tunes out what looks like an overhead road sign to avoid false braking events.” Although Tesla has provided extensive information to drivers pointing out that the Autopilot system requires “continual and full attention” while driving, several autonomous-industry experts have been critical of Tesla’s roll-out for giving the public the impression that the system is more autonomous and hands-free than it is.

We’ve said it before, but as cars become more autonomous, liability may shift from the driver to the car and, therefore, to the manufacturer of the vehicle and/or the supplier of the autonomous component system. But at what point that liability transfers, and by how much, will be up for debate in states across the country over the coming years. As the shift in liability continues, it will also be interesting to see if, and how, a shift in litigating these matters might follow.

Historically, individual states have been responsible for determining liability laws, and they are now setting the rules for highly automated vehicles. However, this creates a coordination problem as each state adopts its own rules. According to the NHTSA Federal Automated Vehicles Policy, states should allocate liability among highly automated vehicle owners, operators, passengers, manufacturers, and others when a crash occurs. The likely question to be raised is: if a highly automated vehicle is determined to be at fault in a car accident, who will be held liable? Furthermore, if individual states set liability and fault thresholds at varying levels with little coordination, manufacturers, drivers, and suppliers will be left scrambling when their products cross state lines.

As major and minor manufacturers race to bring autonomous vehicles and systems to market in the coming months, states are faced with allocating liability among those involved in accidents to a degree far more difficult than in the past. Unlike the traditional method of allocating liability, the autonomous vehicle and system add new players to this calculus: the manufacturer and the supplier of the system. This is largely unknown territory for major vehicle manufacturers and their suppliers, and it demands more vigilance in system development, maintenance, and driver education than past iterations of emerging transit technology required.

Please note that Foley Summer Associate Katrina Stencel was a contributing author of this post. The Dashboard Insights team thanks her for her contributions.
