Autonomous Vehicles Are Here:
What Does that Mean For Us?

By Barbara M. Pelaez, Quintairos, Prieto, Wood & Boyer, P.A.
Sadie Horner, The Bassett Firm
This article was previously published in two parts, on 07.24.18 and 08.07.18, on The Bassett Firm blog.

No longer limited to pop culture and science fiction, autonomous vehicles are real and are here to stay. Although autonomous vehicles have been in development since the 1920s, only recently has there been a big push from entities such as the National Highway Traffic Safety Administration (“NHTSA”) and the Department of Transportation (“DOT”) to bring these vehicles to market. One explanation for this recent push is that autonomous vehicles (“AV”) are expected to reduce motor vehicle accidents (“MVAs”) by 80% to 90%, since the NHTSA estimates that 94% of motor vehicle accidents are caused by human error. But what exactly does “autonomous vehicle” mean?

The NHTSA recognizes six levels of automation, Level 0 through Level 5, in the context of motor vehicles.

The first three levels (0-2) cover the vehicles with which most current drivers are acquainted. For example:

  • A 1994 Jeep Wrangler with a manual transmission (and no cruise control) is generally considered Level 0, or No Automation.
  • A 2010 Nissan Maxima with cruise control would be Level 1, or Driver Assistance.
  • A 2014 Honda Accord with a lane keeping assist system and adaptive cruise control is Level 2, or Partial Automation; such vehicles are the most common on the road these days.

The differences between Level 1 and Level 2 vehicles are not well defined; generally, however, Level 2 vehicles offer more automated features, such as acceleration and deceleration, steering assistance, and braking. In these lower-level automated vehicles, the driver must pay attention and monitor the environment at all times.

Level 3 vehicles, also called Conditional Automation vehicles, are on the market and are becoming widely available. Because the technology is still being developed, a Level 3 driver must remain ready to take control of the vehicle at all times. For example, a 2018 Tesla’s Autopilot is Level 3, or Conditional Automation; the driver is still necessary to the safe operation of the vehicle.

In contrast, the driver of a Level 4 or Level 5 vehicle would not be required to monitor the environment when using the autonomous features. There are no Level 4 or Level 5 vehicles on the market at this time, but Hollywood has given us several examples:

  • The Audi RSQ sport coupe from I, Robot (2004) appears to be Level 4, or High Automation, since Will Smith’s character is able to take control of the vehicle.
  • The Lexus 2054 from Minority Report (2002) appears to be Level 5, or Full Automation, i.e., driverless.

Auto manufacturers, software companies, and a variety of startups, including Google/Waymo, Apple, Uber, Tesla, GM, Ford, Toyota, Nissan, Volvo, BMW, Mercedes, and Audi, are working together to create and implement prototypes of Level 4 and Level 5 vehicles. Some Level 5 prototypes do not even have steering wheels, accelerators, or brakes, meaning the human occupant will not be able to operate the vehicle at all.

Fully automated vehicles will likely be on the market sooner than you think; however, auto manufacturers are still encountering human-error issues with Level 3 vehicles. For this reason, some manufacturers, such as Ford, hope to bypass Level 3 vehicles altogether in favor of developing Level 4 autonomous vehicles. These manufacturers suggest it is unfair to ask a driver to “instantaneously intervene” when an issue arises while the vehicle is in the highly automated state found at Level 3.

Automation, whether full or partial, presents risk. The DOT and the NHTSA, which are responsible for regulating the safety, design, and performance of autonomous vehicles, are implementing guidelines to allow for the full testing of autonomous vehicles in the marketplace without interference from state legislatures. However, states will still be responsible for regulating the human driver in most aspects of vehicle operation, including driver licensing, vehicle registration and titling, and ensuring that traffic laws do not hamper AV technology.

This division of labor leaves us with an interesting question: in the context of motor vehicle accidents, who is responsible when something goes wrong?

As we continue the transition to higher-automation vehicles, liability and responsibility will become murkier concepts. After all, who is truly at fault when a highly automated vehicle is involved in a collision: (a) the driver, (b) the software company/manufacturer, or (c) some combination of the two?

At least one auto manufacturer seems to think that higher automation means liability for the manufacturer. In 2015, Volvo publicly stated that it would accept all liability when one of its autonomous vehicles is involved in an accident while in autonomous mode.1

On the other hand, the NHTSA seems to think the question of liability, at least for Level 3 vehicles, is still an issue of driver error. For example, on May 7, 2016, a Tesla Model S driven by Joshua Brown was involved in a motor vehicle accident, considered the first Level 3 autonomous vehicle crash. Mr. Brown was known for over-relying on his vehicle’s autonomous system, and at the time of the accident he had engaged Autopilot and set the cruise control. When a tractor-trailer came out of an intersecting road, the system did not properly register or respond to this change in road conditions, and a collision occurred. The NHTSA determined that because the vehicle was a Level 3 AV, the driver was still in charge, and he had failed to pay attention; therefore, the cause of the accident was human error.

The National Transportation Safety Board (“NTSB”) has stated that auto manufacturers must be proactive in curtailing driver overreliance on automation. With this in mind, the NTSB has placed at least some responsibility on auto manufacturers to implement restrictions designed to prevent the human driver from becoming overreliant on the AV system.2 In other words, automated vehicle control systems should have limitations that restrict operation of the system to the conditions for which it was designed and is appropriate. For example, Tesla has improved its Autopilot by adding new limits on hands-off driving and other features to monitor the driver’s attention. Liability can therefore potentially be placed on auto manufacturers for failing to design the software system to prevent driver overreliance.

So, how will liability be determined: through an ordinary negligence claim, a products liability claim, or no-fault litigation? Recently, a potential test case arose when Uber’s self-driving test vehicle fatally struck a pedestrian while in autonomous mode. However, the accident occurred on March 19, 2018, and by March 29, 2018, Uber had reached a settlement with the deceased pedestrian’s family.3

Since we are still dealing with Level 3 autonomous vehicles, driver input and attention are still required, and negligence suits against drivers are likely to remain the most common claims arising from motor vehicle accidents.

When Level 4 and Level 5 autonomous vehicles enter the marketplace, which the NHTSA estimates should happen around 2025,4 driver negligence is likely to fade as the focus of motor vehicle litigation, and suits against manufacturers and sellers may become the norm.

Automation technology is here, and higher automation of motor vehicles is likely to become more commonplace. For now, the duty to exercise ordinary care remains with the drivers of motor vehicles, even Level 3 automated vehicles. However, the focus of responsibility and liability in the context of motor vehicle accidents is likely to shift as even higher levels of autonomous vehicles are introduced to the driving public. Be warned and be prepared. A change is coming.

1. http://fortune.com/2015/10/07/volvo-liability-self-driving-cars/
2. William B. Pentecost, Jr., Autonomous Vehicles: Is the Florida Tesla Crash a Harbinger of an Onslaught of Design Defect Claims, or Is It Simply Another Case of Driver Negligence?, The Transportation Lawyer, at 16.
3. https://www.npr.org/sections/thetwo-way/2018/03/29/597850303/uber-reaches-settlement-with-family-of-arizona-woman-killed-by-driverless-car
4. https://www.nhtsa.gov/technology-innovation/automated-vehicles-safety

_____________________________

Articles on the DAYL website are provided for informational use only, and are in no way intended to constitute legal advice or the opinions or views of the DAYL.