On March 24, 2017, the ride-sharing company Uber pulled its fleet of self-driving vehicles off the roads in Arizona after one of its vehicles was involved in an accident in Tempe. This follows similar suspensions of service in San Francisco and Pittsburgh after Uber's self-driving cars were caught running red lights. Although the car involved in the Arizona accident had a driver behind the wheel at the time of the crash, it was in self-driving mode and was struck by another car that failed to yield to it. Incidents like these highlight the growing pains of a new transportation industry that many predict will eventually render driving a personal vehicle a thing of the past.
However, we have quite a way to go until we reach a completely driverless future, and until we do, these vehicles will have to share the road with human drivers. A major legal issue facing the self-driving car industry is who is responsible when a self-driving vehicle gets into an accident. With traditional human-driven vehicles, the driver who is at fault for the accident is responsible for it; but what if there is no human driver to be held accountable? How can a person who is injured in an accident with a robotic car recover damages? These questions are very new, but the courts, federal agencies, and the technology industry are all working to come up with answers.
Below are a few of their ideas.
Theories of Liability
Hold the Owner of the Car Liable
The most traditional and obvious approach to these issues would be to simply hold the owner of the vehicle legally liable if it gets into an accident. Makes sense, right?
Well, here's the problem with that: The owner of a self-driving car is not actually the one driving it. Rather, self-driving cars are controlled by software that tells the car how to behave in a given situation, and when an accident happens in self-driving mode, it is typically because the software malfunctioned, not because the owner of the car was negligent. Thus, it would be very difficult to predicate legal liability on the owner of the vehicle for an accident that was entirely the fault of the self-driving car's software.
Hold the Manufacturer of the Vehicle Liable
Another obvious solution is to hold the company that produces the self-driving car responsible for the accidents it causes. After all, the technology company that built the car is the one that programmed the car's software, and if that software malfunctions and causes an accident, it seems clear that the manufacturer should be liable. The problem with this approach is that it would effectively kill the self-driving car industry. If every tech company that produces self-driving automobiles were legally liable for any accidents its vehicles cause, virtually no company would be willing to take on that level of risk. On the upside, there would no longer be an issue of self-driving vehicle liability, because there likely would be no more self-driving vehicles.
Hold the Car Liable as a Legal Person
"Legal personhood" is a distinct concept that is much broader than the common definition of "personhood." The legal concept of personhood is not so much concerned with who is or is not a human person, but rather, who (or what) is able to be hauled into court. A notable example of non-human legal persons are corporations; they have no flesh or blood, but they are certainly subject to legal liability. In his book Robots are People Too,3 lawyer John Frank Weaver argues that the law should recognize robots as legal persons so that they can be treated as insurable entities similar to corporations and people. That way, the robot's legal liability would be self-contained. The benefits of a system like this are two-fold in that it allows the owner of the self-driving vehicle and the company who produced it to both avoid liability for accidents. The disadvantage of a system like this would be that, by not assigning liability to individual human people, this could lead to a mass evasion of personal responsibility.
Proposed Legislation and Policy
State legislatures and federal agencies are already hard at work on rules for the self-driving car industry. As of November 2016, seven states (California, Nevada, Utah, North Dakota, Arizona, Tennessee, and Florida) and the District of Columbia had begun regulating self-driving cars through either legislation or executive order, while one other (Michigan) was in the process of doing so. The federal government also plans to get into the self-driving car regulation game through the National Highway Traffic Safety Administration (NHTSA). While regulation of drivers through licensing and vehicle registration is traditionally a state responsibility, the federal government intends to regulate the vehicle safety side of self-driving cars.
In September 2016, the United States Department of Transportation issued a policy paper titled "Federal Automated Vehicles Policy: Accelerating the Next Revolution in Roadway Safety" to set out the agency's policies for the safe testing and deployment of automated vehicles. The report contains four sections, each of which is summarized below:
15-Point Safety Assessment: This section of the paper identifies a safety assessment for autonomous vehicle manufacturers that covers the following 15 areas: data recording and sharing; privacy; system safety; vehicle cybersecurity; human-machine interface; crashworthiness; consumer education and training; registration and certification; post-crash behavior; federal, state, and local laws; ethical considerations; operational design domain; object and event detection and response; minimal risk condition; and validation methods.
Model State Policy: This section outlines the division between federal and state responsibilities for the regulation of automated vehicles. The states maintain their traditional regulation of automobiles by licensing drivers, enacting and enforcing traffic laws, and regulating insurance and liability, while the federal government regulates the software, safety, and performance side of automated vehicles.
Current Regulatory Tools: Includes a discussion of the NHTSA's current regulatory tools that can be used to ensure the safe development of new technologies, such as interpreting current rules to allow for greater flexibility in design and providing exemptions to allow for testing of non-traditional vehicles.
Modern Regulatory Tools: Includes a discussion of new regulatory tools and statutes that policy makers should consider in the future to aid in the safe and efficient deployment of automated vehicle technologies.
Call the Dolman Law Group Accident Injury Lawyers, PA Today to Schedule a Free Consultation
If you have been hurt in an accident involving a self-driving car, you should speak to an attorney as soon as possible. To schedule a free case evaluation with one of our lawyers, call the Dolman Law Group Accident Injury Lawyers, PA today at 727-451-6900 or contact us online.
800 North Belcher Road
Clearwater, FL 33765
(727) 451-6900