
Elaine Herzberg’s death forces rethink of self-driving car safety standards

2025.09.17 20:57:31 Bomi Han


[Autonomous car. Photo credit: Unsplash]

In 2018, Elaine Herzberg, aged 49, was struck and killed by a self-driving Uber test vehicle as she wheeled her bicycle across a road in Tempe, Arizona.


The crash was the first recorded pedestrian death involving a self-driving car, and it led Uber to end its testing of the technology in Arizona.


Uber was not criminally charged in connection with the accident, while Rafaela Vasquez, the car's safety driver, was sentenced to three years of supervised probation on related charges.


The incident sparked widespread debate about liability in self-driving car accidents and focused public attention on the risks posed by autonomous vehicles.


Because liability in autonomous car accidents depends on multiple factors weighed case by case, the outcome of this case has become an important precedent for how future legal decisions may be structured and how charging standards may be established.


Although no complete standard or solution to this issue exists yet, many experts are working to address these concerns.


Autonomous cars, also known as self-driving cars, are vehicles that use automated driving systems to reduce or eliminate the need for a human operator.


The Society of Automotive Engineers (SAE) defines six levels of driving automation, ranging from Level 0 (no automation) to Level 5 (full automation), a classification subsequently adopted by the U.S. Department of Transportation.


The levels at which controversy over accident responsibility arises are Level 4, which exists today, and Level 5, which remains theoretical.


At Level 4, a vehicle operates fully on its own within set boundaries, requires no attention or assistance from a human driver, and may lack features such as pedals or a steering wheel.


For example, Waymo's robotaxis operate at Level 4, and Tesla's Robotaxi service is intended to reach the same level.


These vehicles sit at the center of responsibility debates because, unlike vehicles at lower levels of automation, they are designed to operate without human supervision.


With minimal human involvement in driving decisions, the question arises of whether a human passenger should be held liable for an accident in which they played no active role.


Several parties may be held liable for autonomous vehicle accidents, including vehicle manufacturers and software developers.


Since these vehicles function both as transportation and as the decision-making driver, various stakeholders involved in the product’s creation may bear responsibility for accidents.


A vehicle manufacturer, which designs, engineers, and produces motor vehicles, can be held responsible for a self-driving car crash caused by a defect in the car's hardware or software.


Similarly, software developers may be held liable since the algorithms and software that power autonomous vehicles play a critical role in their performance.


If a software error or bug causes a crash, the software developer could be held responsible.


Despite the potential for manufacturer and software engineer liability, the most common legal perspective maintains that humans retain some level of responsibility.


Autonomous cars are not only evidence of advancing human technology but also the source of unprecedented questions about responsibility for car accidents.


These questions focus mainly on how liability is evaluated across different accident scenarios and how responsibility is distributed among the various parties involved.


In the future, establishing clear standards and legal frameworks for autonomous car accidents will be essential to the safe and legally well-defined operation of autonomous cars on public roads.

Bomi Han / Grade 11 Session 3
Thornhill Secondary School