Disclaimer

Statements posted on this blog represent the views of individual authors and do not necessarily represent the views of the Center for Law Science & Innovation (which does not take positions on policy issues) or of the Sandra Day O'Connor College of Law or Arizona State University.

Marchant Quoted in ABA Journal Article

Hands gripping a steering wheel may soon be a thing of the past. Safer, automated technology is advancing quickly, turning human drivers into carefree, unaccountable passengers. No more worries about which driver is at fault. Right? Or is attaching liability more complex now?

Recall the driver who was killed in 2016 behind the wheel of his Tesla Model S. According to an ABA Journal article posted last month, the Tesla autopilot technology was not sophisticated enough to recognize the danger awaiting the 40-year-old driver. The National Highway Traffic Safety Administration therefore concluded that Tesla was not at fault in the incident.

So who should be at fault in such situations — especially when “highly” automated systems become the norm?

According to LSI Faculty Director Gary Marchant, “[m]ultiple defendants are possible: the company that wrote the car’s software; the businesses that made the car’s components; the automobile manufacturer; maybe the fleet owner, if the car is part of a commercial service, such as Uber.”

Identifying who could or should be at fault is one matter. Another is determining the legal standard that should apply. Some experts suggest strict liability. Marchant, however, disagrees, stating, “[t]here will be far fewer accidents with HAVs [highly automated vehicles], but when they occur the vehicle’s manufacturer will be sued. So carmakers will have more liability than they do now for making a safer product.” As Marchant points out, fairness is a concern here. And in the current digital age, one cannot ignore the possibility that autopilot systems may be hacked, in which case, under a strict liability standard, the manufacturer would still have to establish that the system was not sold in an unreasonably dangerous, defective condition.

Interestingly, the ABA Journal article states, “the car did not brake, nor did it issue any warning to Brown.” However, it has been confirmed that the Tesla system gave the fatally injured driver multiple warnings, which for an unknown reason he did not, or could not, heed.