By Justin Garlisi
Self-driving cars are an emerging technology whose safety features continue to grow in number.[1] Automated features such as collision and lane-departure warnings emerged in the early 2000s and have become increasingly complex.[2] The National Highway Traffic Safety Administration has documented the various degrees of possible automation, ranging from minor driving assistance to full automation.[3] Automated cars capable of driving under all circumstances without any human input are not yet available to the public, but such cars may make their debut over the coming years and decades.[4]
Once self-driving cars are commonplace, legislatures will have plenty of issues to consider.[5] Normally, when a car crashes into someone, the injured person can sue the driver. But what if the car was driving itself? Who the injured person should sue—the owner, the manufacturer, the seller, or even the car itself—is not immediately obvious.[6] And in minor accidents involving one or more self-driving cars, it is unclear which of the owners’ insurance companies should pay.[7]
Today’s criminal and traffic laws regulate conduct on the road in abundance, but most of them assume that a human driver is controlling the vehicle; their provisions fail to consider the possibility that drivers could one day be divorced from their cars.[8] That is not to say legislation on self-driving cars is a barren wasteland. In fact, several states have passed legislation on autonomous vehicles over the past several years, with Nevada legalizing their use as early as 2011.[9] Rather, more work remains to be done in addressing the reality of a car driving without a driver.[10] When people involved in accidents with self-driving cars litigate in court, judges may have to consider how the divorce between driver and car affects which legal theories should apply.
Liability questions are a pressing concern, given that some self-driving cars already on the road have been involved in fatal accidents, plowing into other cars or pedestrians.[11] For example, in 2019, a driver in California named Kevin enabled his Tesla’s Autopilot system and, to his dismay, the Tesla charged through a red light and smashed into another car, killing the two people inside it.[12] Prosecutors charged Kevin with two counts of vehicular manslaughter.[13] He pled not guilty.[14] The judge put him on probation, sentenced him to community service and other work, and placed him under house arrest.[15]
What happens to people like Kevin depends either on what legislatures mandate or, if legislatures leave liability questions unanswered, on how judges resolve those questions case by case.[16] The holdings of past cases could apply by analogy to self-driving cars.[17] And legal theories such as negligence, strict liability, and express and implied warranties can arguably extend to self-driving cars.[18] But this is only a start. As manufacturers increase their vehicles’ level of autonomy, the divorce between driver and vehicle will grow more apparent. Legislatures should think carefully about how the law should handle fully autonomous cars before they become widespread.
[1] See Automated Vehicles for Safety, Nat’l Highway Traffic Safety Admin., https://www.nhtsa.gov/technology-innovation/automated-vehicles-safety (last visited Sept. 18, 2023).
[2] See id. (providing a diagram that outlines the progress of automated safety features).
[3] Id. (describing six levels of automation).
[4] See id.
[5] See Mbilike M. Mwafulirwa, The Common Law and the Self-Driving Car, 56 U.S.F. L. Rev. 395, 398 (2022) (considering the implications of a self-driving car getting into a fender-bender).
[6] Id.
[7] Id.
[8] See Spencer C. Pittman & Mbilike M. Mwafulirwa, Not So Hypothetical After All: Addressing the Remaining Unanswered Questions About Self-Driving Cars (Mar. 2019), https://www.okbar.org/barjournal/mar2019/obj9003pittmanmwafulirwa/ (“Our state’s formidable arsenal of existing criminal and traffic laws have one glaring shortfall when it comes to self-driving cars – they are all enforced on the assumption that a human driver operated the vehicle.”).
[9] See Autonomous Vehicles | Self-Driving Vehicles Enacted Legislation, Nat’l Conf. of State Legislatures (Feb. 18, 2020), https://www.ncsl.org/transportation/autonomous-vehicles.
[10] Mwafulirwa, supra note 5, at 399 (“Instead of a comprehensive civil and traffic framework for self-driving cars, however, we have a hodgepodge of liability rules.”).
[11] Phil McCausland, Self-Driving Uber Car That Hit and Killed Woman Did Not Recognize That Pedestrians Jaywalk, NBC News (Nov. 9, 2019, 3:28 PM), https://www.nbcnews.com/tech/tech-news/self-driving-uber-car-hit-killed-woman-did-not-recognize-n1079281.
[12] Joseph Guzman, Tesla Driver Faces Felony Charges in Fatal Crash Involving Autopilot, The Hill (Jan. 19, 2022), https://thehill.com/changing-america/well-being/590473-tesla-driver-faces-felony-charges-in-fatal-crash-involving/.
[13] Id.
[14] Id.
[15] Simon Alvarez, Tesla Driver Behind Tragic Model S Crash in CA Gets Probation, Teslarati (June 30, 2023), https://www.teslarati.com/tesla-driver-autopilot-crash-ca-gets-probation/.
[16] Mwafulirwa, supra note 5, at 402.
[17] Id. at 396 (“The way of the common law, as Oliver W. Holmes Jr. and Benjamin N. Cardozo told us, is reasoning by analogy informed by tradition.”).
[18] Id. at 403.