In this hoganlovells.com interview, Dr. Sebastian Polly talks about how connected cars and automated/autonomous vehicles (AVs) will impact the concept of product liability. He also addresses how automakers and suppliers will need to adapt their approach to product safety and product compliance to manage the risks associated with current and future mobility trends. “Players in this space are reinventing the car over and over, generating a lot of questions that need to be answered,” said Polly.
How have the connected car and AVs changed the way people think about product liability?
Polly: Typically, when people talked about product liability, they were thinking only of civil law liability — meaning a person is injured in an accident allegedly caused by a defective product. That would trigger civil law liability, meaning you would owe the injured person compensation for material losses, like the cost of medical treatment, and for immaterial losses, like pain and suffering. Moreover, when we talk about liability and driving today, it is one driver versus another driver over an alleged driving error. That is the majority of product liability cases in the automotive industry today, and it doesn’t affect automakers or suppliers very much. There are very few technical defects in today’s cars, and even fewer accidents triggered by technical issues in cars. That, however, is the old concept.
We will experience a massive shift in liability. The vehicle will, step by step, take over more responsibility and perform more driving maneuvers — from highly and fully automated up to fully autonomous driving. If an accident happens, it will not be one driver versus another driver. It will be traffic victims, either outside the vehicle or passengers inside it, versus the party responsible for the vehicle — typically the automaker or a supplier, because they were responsible for an alleged defect in the vehicle.
Will liability for vehicle accidents shift from the individual driver to the automaker and suppliers?
Polly: The overall number of auto accidents will go down because automated and autonomous cars will be much safer. But the accidents that do occur might become the responsibility of automakers and suppliers, which will have to defend their products. Now when we talk about connectivity and AVs, the concept of liability broadens. It’s not just civil law liability anymore — it’s also product safety and product compliance. They form a kind of triangle.
If there is an alleged defect in a car, the car could be considered unsafe, and that in turn might trigger a recall. The recall could be more dangerous and challenging for the automaker or supplier than the actual accident they are dealing with. If there is a safety allegation, a company could have thousands or millions of vehicles in its fleet running the same program and algorithms, all instantly considered unsafe. That could trigger an unprecedented safety issue.
What is the industry standard for product compliance and product safety as it applies to connected cars and autonomous vehicles?
Polly: If you want to sell a connected or automated/autonomous car, you need to assure your customers that your product is free from defects and safe. But how can you know it is safe? What is the correct safety standard? In a highly innovative and constantly developing market, defining product compliance is a massive challenge for the entire industry at the moment.
The laws in the EU state that a product, including a car, needs to meet reasonable safety expectations. But what is your reasonable safety expectation regarding an automated car? Does it have to be as good as you? Or does it have to be as good as the average driver? Or does it need to be perfect in terms of being able to avoid any and all accidents? Or is that reasonable safety expectation somewhere in between?
Could automakers, suppliers, or company employees be found criminally liable for car accidents?
Polly: The biggest threat to automakers and suppliers in terms of product compliance is the criminal product liability that comes along with it. If an accident is caused by a defect in the vehicle’s programming and a person dies, is the developer or programmer criminally responsible for the death? That would be an entirely new concept. And a company cannot insure itself against criminal liability.
If there were a public prosecutor’s investigation following an accident, investigators might discover that a developer or manager signed off on an aspect of the car and was potentially negligent with respect to the duty of care. As a consequence, that person might be exposed to criminal product liability risks. In a worst-case scenario, one cannot even exclude that a public prosecutor might press charges against an engineer, programmer, or manager for negligent manslaughter.
At Hogan Lovells, our role is to help protect automotive companies and individual decision makers from product liability and criminal product liability risks. We brief our clients on these issues and train them so they understand the legal challenges they personally face. We then help them assess those challenges and advise them on how to handle them properly. If we help companies navigate these challenges, it will be very hard later on for somebody to claim that a developer or manager acted negligently.
We also help companies create an accurate and proper paper trail. If something goes wrong five years from now and a bad paper trail from the development phase creates an incorrect impression, it could expose the company to unnecessary risk in a lawsuit.
What are the ethical dilemmas AVs will have to be programmed to handle and act upon?
Polly: There are, for example, ethical dilemmas that go along with autonomous driving. What is an autonomous vehicle supposed to do if it suddenly identifies a human walking across the street who isn’t supposed to be there? The car may not have enough time to brake, leaving it two options: run over the person crossing the street, or take evasive action. But if the car takes evasive action, is it allowed to hit something else? Running someone over is typically not very dangerous for the passenger in the car. But if the car swerves and hits a tree or a wall, that could be very dangerous for the passenger.
What is the ethical move the car needs to make? Protect the person inside the car, or the person outside it? Let’s assume we could answer that question. A company then has to ask itself: what do our customers want the car to do? Depending on the outcome, customers might be unwilling to trust the technology and buy it. It’s a very delicate concept. However, it also illustrates the massive challenges the automotive industry is facing.