Tesla employee may have become the victim of the first fatal accident involving 'Full Self-Driving' (FSD)

On May 16, 2022, a Tesla crashed into a tree on the side of the road and burst into flames in the suburbs of Denver, Colorado. The person who died in the accident was a Tesla employee in charge of recruiting engineers, and he may have been using the beta version of Full Self-Driving (FSD) at the time, the Washington Post reports.

Tesla employee in fiery crash may be first 'Full Self-Driving' death - Washington Post

Hans von Ohain, who worked at Tesla as an engineer recruiter, went golfing with his friend Eric Rossiter on May 16, 2022. Mr. von Ohain drove a Tesla in his private life as well, and was reportedly using FSD, driver assistance software he received for free as an employee benefit. However, FSD apparently struggled on the mountain road leading to the golf course, and Mr. von Ohain repeatedly had to steer the car back as it tried to veer off the road.

Mr. Rossiter, who was a passenger in the Tesla, recalled, 'The first time it happened (the Tesla trying to veer off the road under FSD), I asked, "Is this normal?" and he said, "Yeah, that happens every now and then."'

The two reportedly played golf together and had a few drinks along the way. On the way home, von Ohain's Tesla crashed into a tree on the side of the road and burst into flames. Mr. Rossiter managed to escape and called emergency services, but the driver, Mr. von Ohain, was confirmed dead.

Photos of the Tesla involved in the accident show the car torn apart by the collision and flames. Colorado State Police Sergeant Robert Madden, who responded to the scene, said the crash that killed von Ohain produced one of the most intense vehicle fires he had ever seen. According to the accident investigation report, the intensity of the fire was due to the electric car's lithium-ion battery, and Mr. von Ohain died from smoke inhalation and burns.

A post-mortem examination found that Mr. von Ohain's blood alcohol level was three times the legal limit, indicating that he was intoxicated. However, Rossiter told emergency personnel that the car was using Tesla's self-driving feature and simply went straight off the road, and he also told the Washington Post that he believed von Ohain was using FSD at the time. This has raised the possibility that Tesla's FSD was involved in Mr. von Ohain's death. The Washington Post writes, 'If true, his death would be the first fatal crash involving Tesla's cutting-edge driver assistance technology (FSD).'

According to Mr. Madden, there were no signs that the brakes were applied at the scene of Mr. von Ohain's accident, and there was evidence that the Tesla continued to supply power to the wheels even after colliding with the tree. 'Given the crash dynamics and the fact that the vehicle left the road with no evidence of sudden maneuvering, the evidence is consistent with a driver assistance feature being in use,' Madden said.

Tesla owners often report dangerous behavior from their vehicles' driver assistance systems: around 900 accidents involving Tesla vehicles have been reported since 2021, when U.S. regulators began requiring the reporting of accidents involving driver assistance systems. Tesla has rolled out FSD to about 400,000 users, but FSD is still in beta development, and the user manual lists situations where it may not work well, such as narrow roads with oncoming traffic or winding roads. Tesla also states that drivers must remain in control of their vehicles even when FSD is enabled, and that the company is not responsible for distracted or drunk driving.

In 2023, a recall was announced for the FSD beta over dangerous road behavior, such as rolling through stop signs and entering intersections without due caution when a light turns yellow.

Recall announced for approximately 360,000 Tesla cars due to risks found in the 'Full Self-Driving' beta - GIGAZINE

Tesla has reported Mr. von Ohain's accident to the National Highway Traffic Safety Administration (NHTSA), part of the U.S. Department of Transportation. Investigators were able to confirm that a driver assistance feature was in use at least 30 seconds before the collision, but the fire was so intense that it was impossible to determine whether that feature was FSD or Autopilot, the more basic system that assists the driver with vehicle speed and steering within the lane depending on traffic conditions.

Mr. von Ohain, a military veteran, joined Tesla because he was drawn to Mr. Musk and the company's mission of bringing electric and self-driving cars to the masses, according to his wife, Nora Bass. As an employee of a company working on advanced technology, Mr. von Ohain used FSD every time he got in the car, and in doing so contributed real-world driving data to Tesla.

Bass said, 'Regardless of how drunk Hans was, Elon Musk claims this car can drive itself and is essentially better than a human. We were lulled into a false sense of security.' She told the Washington Post that Tesla should bear some responsibility for her husband's death.

Ed Walters, who lectures on self-driving car law at Georgetown University, believes that von Ohain's drunkenness also played a role in the accident, since alcohol dramatically slows human reaction times. 'A sober driver would normally have been able to get the car back onto the road and safely correct the Tesla's mistake,' Walters said. 'No matter what kind of car you drive or what software you use, people need to understand that they must pay attention, stay sober, and be careful.'

Meanwhile, Andrew Maynard, a professor who researches advanced technology at Arizona State University, said that reports of frequent unsafe behavior in Tesla cars call into question Tesla's decision to roll out FSD to customers. 'FSD is not yet ready for widespread use,' Maynard insisted, adding that the value of testing FSD on public roads must be weighed against the risk of deploying it to drivers who overestimate its capabilities.

in Software, Ride, Posted by log1h_ik