Tesla's Autopilot and FSD have been linked to hundreds of crashes and dozens of fatalities
An investigation report released by the National Highway Traffic Safety Administration (NHTSA), the US agency responsible for road safety, has linked Tesla's Autopilot and Full Self-Driving (FSD) driver-assistance features to hundreds of crashes and dozens of deaths.
Additional Information Regarding EA22002 (PDF)
https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf
Tesla's Autopilot and Full Self-Driving linked to hundreds of crashes, dozens of deaths - The Verge
https://www.theverge.com/2024/4/26/24141361/tesla-autopilot-fsd-nhtsa-investigation-report-crash-death
In March 2023, a student who had just gotten off a school bus in North Carolina, USA, was struck by a Tesla Model Y traveling at 'highway speeds.' The driver of the Model Y claimed to have been using Autopilot at the time. The 17-year-old student suffered life-threatening injuries and had to be airlifted to a hospital.
Following this accident, NHTSA investigated hundreds of collisions involving Tesla vehicles and released a report summarizing its findings on April 25, 2024. The report found that a combination of driver inattention and shortcomings in Tesla's technology had led to hundreds of injuries and dozens of deaths.
The investigation determined that drivers using Tesla's Autopilot or FSD 'were not adequately engaged in the task of driving,' and concluded that Tesla's technology 'did not adequately ensure that drivers maintained their attention on the task of driving.'
NHTSA investigated a total of 956 crashes involving Tesla vehicles that occurred between January 2018 and August 2023. Across those crashes, 29 deaths were confirmed; some involved other vehicles striking the Tesla. In another 211 crashes, the Tesla struck a vehicle or obstacle in its path, resulting in 14 deaths and 49 injuries.
In its report, NHTSA noted that Tesla's Autopilot and FSD 'are not designed to keep drivers focused on the task of driving.' Tesla warns drivers to keep their hands on the wheel and their eyes on the road when using Autopilot and FSD, but NHTSA observed that 'in many cases, drivers become overly complacent and lose focus.'
In 59 of the crashes NHTSA examined, Tesla drivers had five or more seconds to avoid striking another object; in 19 of those crashes, the driver had more than 10 seconds. Reviewing crash logs and data provided by Tesla, NHTSA noted that in the majority of the crashes it analyzed, 'drivers did not brake or steer to avoid the hazard.'
'Incidents in which drivers failed to take evasive action or attempted evasive action too late have been observed across all Tesla hardware versions and crash scenarios,' NHTSA wrote.
NHTSA also compared Tesla's Level 2 (L2) driver-assistance features with those of other companies, pointing out that, unlike competing systems, Autopilot disengages when the driver adjusts the steering rather than allowing shared control, a design that 'discourages' drivers from staying engaged in the task of driving.
'Our comparison of Tesla's design choices to those of its L2 peers reveals that Tesla is an industry outlier in its approach to L2 technologies, with a weak driver engagement system that is inconsistent with Autopilot's permissive operating capabilities,' NHTSA said.
NHTSA also pointed out that the name 'Autopilot' is misleading and may lead drivers to believe the system can operate the car without their involvement. Competitors, by contrast, market similar features using words such as 'Assist,' 'Sense,' and 'Team,' and NHTSA emphasized that Tesla's naming risks confusing drivers.
Tesla had voluntarily announced a recall at the end of 2023 in response to the NHTSA investigation, rolling out a software update that added more warnings to Autopilot. Alongside its investigation report, however, NHTSA announced that it would open a new investigation into whether that recall remedy was adequate.