Founding Member & Managing Partner at Gina Corena & Associates
Practice Areas: Personal Injury
Self-driving cars are reshaping the future of transportation, yet concerns about their safety have sparked significant debate. While autonomous vehicles promise convenience and reduced human error, many Americans remain hesitant to embrace the technology. According to one survey, only 21% of respondents said they would feel comfortable riding in a driverless car. But is this fear justified?
How often do self-driving car accidents really occur, and what does the data reveal?
Recent findings show that many U.S. drivers still have reservations. In a 2024 survey conducted by AAA, 66% of participants expressed fear of fully autonomous vehicles, while another 25% were uncertain.
Despite this reluctance, there’s strong interest in semi-autonomous technologies like Reverse Automatic Emergency Braking (AEB) and Lane Keeping Assistance, suggesting that incremental progress in vehicle automation could help ease public concerns. The key challenge for the industry lies in consistently improving these technologies and maintaining reliable performance to boost consumer confidence.
Autonomous vehicle companies have been required to submit crash data to the National Highway Traffic Safety Administration (NHTSA) since June 2021, though reports date back to August 2019. By mid-2024, 3,979 incidents involving autonomous vehicles had been recorded, with 473 of those occurring in the first half of 2024 alone.
This data includes vehicles equipped with Automated Driving Systems (ADS) and Advanced Driver Assistance Systems (ADAS).
Looking at yearly trends, 2022 saw the most incidents, with 1,450 accidents involving autonomous vehicles. Here’s a breakdown of reported accidents by year:
Year | Number of Incidents
2019 | 4
2020 | 25
2021 | 641
2022 | 1,450
2023 | 1,353
2024 (through June 17) | 473
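For readers who want to check these figures, here is a minimal Python sketch that totals the yearly counts copied from the table above and computes the year-over-year change; the variable names are illustrative only. Note that the yearly rows sum to 3,946, slightly below the cumulative 3,979 figure cited earlier; the source data does not explain the gap, which may reflect reports without a clear incident date.

```python
# Reported autonomous-vehicle incidents by year (figures copied from the table above).
# The 2024 entry is a partial year, covering January 1 through June 17.
incidents_by_year = {
    2019: 4,
    2020: 25,
    2021: 641,
    2022: 1450,
    2023: 1353,
    2024: 473,  # partial year
}

# Total incidents across the listed years.
total = sum(incidents_by_year.values())
print(f"Total reported incidents, 2019 through mid-2024: {total:,}")

# Year-over-year change between consecutive entries.
years = sorted(incidents_by_year)
for prev, curr in zip(years, years[1:]):
    change = incidents_by_year[curr] - incidents_by_year[prev]
    print(f"{prev} -> {curr}: {change:+,} incidents")
```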
While the number of self-driving car accidents remains relatively low compared to traditional vehicles, the upward trend underscores the need for ongoing improvements in safety systems and regulations. As the industry advances, balancing innovation with safety will be critical in addressing public apprehension and ensuring widespread adoption of autonomous vehicles.
Data shows that self-driving car accidents are increasing as the technology becomes more common. However, it’s important to note that not all “self-driving” cars are fully autonomous. The National Highway Traffic Safety Administration (NHTSA), following the SAE classification, recognizes six levels of driving automation, from no automation (Level 0) to fully driverless (Level 5). Most vehicles on the road today are at Level 1 or 2, where drivers still play a significant role.
The level of autonomy directly affects accident rates. Fully autonomous cars, such as those Waymo operates in cities like Phoenix and San Francisco, including through a rideshare partnership with Lyft, can drive without human intervention under certain conditions. These Level 4 vehicles are still limited in availability, but they are part of the industry’s effort to reduce accidents caused by human error. As the technology advances, the hope is that higher levels of automation will further decrease the number of accidents.
Advocates believe self-driving cars can drastically lower accident rates. The NHTSA has estimated that 94% of serious crashes are due to human error, such as distracted or impaired driving. Autonomous technology aims to eliminate these mistakes, offering quicker response times and more consistent decision-making than human drivers.
Despite this potential, accidents involving self-driving vehicles continue to rise. While the technology shows promise, it’s not yet foolproof. Services like Lyft and Waymo offer a glimpse of what the future might hold, but significant advancements are needed before self-driving cars can truly reduce accidents on a large scale.
While self-driving cars hold promise, they are not without risk. According to the NHTSA, 392 crashes involving vehicles equipped with advanced driver assistance technology were reported over a 10-month period. These accidents resulted in six fatalities and five serious injuries. Tesla accounted for a large share of these incidents, with 273 crashes, five of them fatal, involving vehicles equipped with its Autopilot driver assistance program. Other automakers, including Honda, Subaru, Ford, GM, BMW, Volkswagen, Toyota, Hyundai, and Porsche, were involved in fewer incidents.
One common cause of accidents is hardware or software failure within the self-driving system. A malfunctioning sensor or camera may fail to detect a hazard, and the vehicle may stray into the path of oncoming traffic, causing a collision. These systems must operate reliably, because even a small failure can have serious consequences.
Another significant factor is driver complacency. Drivers may rely too heavily on the car’s assistance features, leading to a false sense of security. However, the driver must remain alert and be ready to take control of the vehicle if necessary. If the driver’s reaction time is too slow, an accident may occur, especially in unexpected situations.
“Warning fatigue” is another issue linked to self-driving car accidents. Driver assistance systems often issue frequent alerts when human intervention is needed. Too many warnings can overwhelm or distract the driver, leading them to ignore important alerts or make critical errors. In such cases, a delayed or missed response to these warnings can result in an accident.
Determining liability in a self-driving car accident can be complex. If the vehicle’s driver assistance technology fails to detect a hazard—such as an approaching vehicle or a pedestrian—the manufacturer could be held liable for any resulting damages. This is particularly true if a mechanical failure prevents the system from issuing timely warnings, leaving the driver unaware of an immediate danger.
In some cases, the human driver may be liable, especially if they were not paying proper attention. Drivers of vehicles with autonomous features are still required to be able to take control of the car when needed. If a driver is distracted, drowsy, or impaired by alcohol or drugs, they can be held responsible for accidents. In a well-known case, the safety driver of an Uber autonomous test vehicle in Tempe, Arizona, was found to have been streaming a television show on her phone instead of watching the road when the vehicle struck and killed a pedestrian.
Self-driving car accidents add another layer of complexity to dealing with the aftermath of a crash. Managing medical bills can be particularly challenging if you are an out-of-state resident or involved in specific circumstances, such as an accident with an Amazon delivery driver.
The situation can worsen if the other party is uninsured, as seen in many uninsured motorist accidents in Las Vegas. Beyond the physical injuries, victims often face emotional trauma, such as post-accident PTSD, adding to the ordeal.
If you’ve been involved in a self-driving car accident, it’s crucial to have skilled legal representation. Our experienced car accident lawyer in Las Vegas is dedicated to providing personalized attention and professional support for your case. Contact us or call 702-680-1111 for a free consultation.
As founder of Gina Corena & Associates, Gina is dedicated to fighting for the rights of people who suffer life-changing injuries in car, truck, and motorcycle accidents, as well as in other types of personal injury cases. She feels fortunate to serve the Nevada community and to hold wrongdoers accountable for the harm they cause her clients.