“Auto-Pilot” May Not Be Perfect, But It May Be Better

Cozen O'Connor

Virtually every day, there are media reports regarding the introduction of driverless cars to mainstream consumers. As driverless cars rapidly accelerate from concept to commercialization, recent accidents make it increasingly apparent that technological refinements remain necessary and are ongoing.

For instance, earlier this year in Europe, the driver of a Tesla operating in autopilot mode ran into the rear of a van stopped on the highway. Then, in July of this year, a Tesla operating in autopilot mode crashed on a winding two-lane road in Montana. While neither of those drivers was seriously injured, on May 7, 2016, the driver of a Tesla in autopilot mode was killed in Florida when his vehicle failed to stop as a tractor trailer crossed its path. A spokesperson for Tesla indicated that the car was unable to distinguish the white side of the tractor trailer against the backdrop of a bright sky. This is the first reported fatality for any vehicle operating in a self-driving mode. Although the family of the driver has not sued Tesla, they have reportedly retained counsel, who is investigating the crash.

Importantly, while Tesla refers to its self-driving technology as “autopilot,” it has stated that this does not mean the system is completely autonomous. In fact, the owner’s manual for the Model S instructs that when the vehicle is operating in autopilot mode, drivers need to keep their hands on the wheel and be prepared to take over at any time. In a 2014 interview, Elon Musk, the founder and CEO of Tesla, said the vehicles are not yet automated to the point where drivers can go to sleep and wake up at their destination. In general, Musk has taken the position that regardless of the use of autopilot, the driver is always responsible for the safe operation of the vehicle. To this point, in the reported accidents involving Teslas operating in autopilot mode, the drivers did not have their hands on the wheel as instructed. This includes the driver in Florida, who appears to have been watching a movie when the crash occurred.

Unfortunately, as with other products, consumers often do not follow manufacturers’ warnings, yet they will pursue legal action even after ignoring the most explicit warnings and instructions. A touchstone of many product liability suits is the expectation of the consumer. As such, manufacturers can expect drivers to argue, for example, that the term “autopilot” conveys that the vehicle is capable of operating without driver assistance. Litigants in product liability lawsuits will also likely argue that although an owner’s manual may provide warnings about operating vehicles in self-driving mode, those warnings do not go far enough to protect the vehicle occupants.

Although strict liability is a fairly common standard for product liability lawsuits in the U.S., manufacturers are far from defenseless. Causation is likely to remain king, and plaintiffs will need to establish that a defect in the automated operation of the vehicle is what caused the accident. Absent evidence of a defect, or evidence that the defect caused the accident, plaintiffs may find themselves in hot pursuit of nothing. Even if causation can be established, manufacturers can assert contributory (or comparative) negligence, assumption of the risk, and misuse, among other defenses.

There has also been regulatory activity examining this technology to ensure its safety. For example, the German transport ministry is currently considering requiring manufacturers of vehicles with self-driving capabilities to install “black boxes” to record information for determining the causes of future accidents. In the U.S., NHTSA administrator Mark Rosekind recently indicated that his agency will continue to work toward establishing effective regulations for driverless vehicles, and he added that the U.S. Department of Transportation remains committed to the continued development of self-driving vehicles. In support of self-driving technology, Rosekind cited that of the 32,500 traffic fatalities on U.S. roads in 2015, 94 percent resulted from accidents caused by driver error. He indicated that highly automated vehicles could eliminate 19 out of 20 vehicle accidents and could save a significant number of lives.

While product liability litigation involving self-driving cars is certain to follow widespread introduction and adoption of the technology, the defense costs are likely to be outweighed by the substantial impact these cars can have on safety. In spite of a handful of accidents, industry leader Tesla has reported that its vehicles have safely logged more than 140 million miles while operating in autopilot mode. Indeed, driverless cars promise fewer vehicle accidents and ultimately fewer lawsuits. If that is the case, it seems time for everyone to jump on board.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Cozen O'Connor
