We have heard a great deal lately about the coming age of the “self-driving car.” With Google, Tesla, and Uber fielding autonomous cars, and traditional automakers like Mercedes-Benz, BMW, and Infiniti not far behind, there has been considerable concern over whether self-driving cars are safe. With the first self-driving cars have come the first self-driving car crashes, and now we are faced with the question: who is legally responsible when a self-driving car crashes?
As citizens, we quite reasonably expect that if a driver causes a car wreck that injures us or damages property, the driver is responsible for making it right. The law fulfills that expectation: if you can prove in court that the driver caused the wreck through a failure to use ordinary care or through a violation of a traffic law, the court will hold the driver (and through him, his auto insurer) liable for your damages.
But what happens if the car is being driven by a computer instead of a person? Machines have no legal responsibilities, and the court cannot hold the car itself liable for an accident. The court can only hold people and businesses liable.
Many commentators say that liability for accidents caused by self-driving cars will be borne by the manufacturers under products liability principles. A 2014 paper by the Brookings Institution suggests that products liability law is already adequate to address these liabilities. However, there are a number of reasons that products liability law should not be the whole story:
First, under any products liability theory, a plaintiff would have to prove that the car or its software was unreasonably dangerous, was defective due to the manufacturer’s negligence, or breached a warranty. Proving that would typically require expensive expert testimony, which is not required in an ordinary car-wreck case involving human drivers.
If we are injured when a self-driving car makes the same kind of mistake that would subject a human driver to liability, most of us would not expect to have to take the deposition of a computer programmer in Munich to determine whether he was negligent, or to hire an expert to analyze computer code for defects. We would reasonably expect that the person who chose to turn over control of his car to a computer ought to be held responsible when the computer causes a wreck.
This may already be the case in some jurisdictions. For example, the District of Columbia requires that any autonomous vehicle have a human driver “prepared to take control of the autonomous vehicle at any moment.” If the driver must be prepared to take control at any moment, he would surely be responsible for failing to prevent the car from causing a wreck. Until we have cars with no pedals or steering wheels at all, someone in a self-driving car probably ought to be held responsible for ensuring that the car is operating safely.
A court-developed doctrine similar to the “non-delegable duty” doctrine may be the answer. In many circumstances, a person or business owes a duty that he or it cannot pass off onto another. For example, a property owner cannot relieve himself of the duty to keep his premises reasonably safe simply by hiring a property manager; he remains liable for injuries if the property manager fails to carry out that duty. A driver normally owes the public a duty to operate his vehicle in a reasonable and prudent manner, and should not be able to relieve himself of that duty by “delegating” it to a computer.
These issues will probably be addressed by state legislatures, particularly as more reports of self-driving cars causing accidents emerge. Some states may adopt “no-fault insurance” schemes like those already in place in many states. Others may adopt statutes requiring a human operator, as D.C. does, and holding the operator responsible for the car’s safe operation. Ultimately, whether it comes from the courts or the legislatures, the law ought to meet the reasonable expectations of the public when assigning liability, and that probably does not mean relieving owners or drivers of responsibility for wrecks caused by self-driving cars.