What role did "false positives" play in two fatal self-driving accidents?


by Jakob Härter

In March 2018, two traffic accidents involving "automated driving" occurred. In one, a Tesla Model X collided with a median barrier, killing the driver; in the other, an Uber self-driving test vehicle struck and killed a woman crossing the road. Hackaday points out that "false positives" were deeply involved in both accidents.

Fatalities vs. False Positives: The Lessons from the Tesla and Uber Crashes | Hackaday
https://hackaday.com/2018/06/18/fatalities-vs-false-positives-the-lessons-from-the-tesla-and-uber-crashes/

The Tesla Model X accident occurred on March 23, 2018. A vehicle traveling on the highway under the Autopilot function collided with a lane divider, and the driver died. Experiments with other vehicles have demonstrated that Autopilot can follow the white line leading toward a highway exit and, as a result, steer the vehicle into the lane divider.

It turns out that the fatal Tesla Model X accident occurred in the same situation as a previous case - GIGAZINE


Highways are full of stationary objects such as concrete barriers and road signs. The vehicle's forward-looking radar naturally detects these objects, but a simple radar has low angular resolution, so it "falsely detects" signs that pose no collision risk. If the system gave priority to radar returns, Autopilot would hit the brakes every time a sign appeared. Tesla therefore achieved stable-speed driving by giving priority to the camera, which in turn led to "overlooking" the lane divider.
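The dilemma above can be sketched as a toy sensor-weighting rule. This is a hypothetical illustration, not Tesla's actual fusion logic: the weights, confidences, and threshold are made up. The point is that an overhead sign and a concrete divider can look identical to the sensors, so no weighting distinguishes them.

```python
# Hypothetical sketch (NOT Tesla's actual algorithm): how weighting a noisy
# radar against a camera trades phantom braking for missed real obstacles.

def should_brake(radar_conf, camera_conf, radar_weight, camera_weight,
                 threshold=0.5):
    """Brake if the weighted sum of sensor confidences exceeds a threshold."""
    fused = radar_weight * radar_conf + camera_weight * camera_conf
    return fused >= threshold

# An overhead sign: strong stationary radar return, camera sees no obstacle.
sign = dict(radar_conf=0.9, camera_conf=0.1)
# A concrete lane divider: to these sensors, it looks exactly the same.
divider = dict(radar_conf=0.9, camera_conf=0.1)

# Radar-heavy weighting brakes for the divider -- but also for every sign.
print(should_brake(**sign, radar_weight=0.8, camera_weight=0.2))     # True
# Camera-heavy weighting ignores signs -- but also "overlooks" the divider.
print(should_brake(**divider, radar_weight=0.2, camera_weight=0.8))  # False
```

Because the two objects produce identical sensor readings here, any fixed weighting that suppresses phantom braking on signs also suppresses braking for the divider.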

The Uber self-driving car accident occurred on March 18, 2018.

Uber's self-driving car strikes and kills a woman; Uber suspends road testing of automated driving - GIGAZINE


In this accident, the LIDAR system installed in the car is known to have detected "a bicycle or an unknown vehicle that may change course" six seconds before the collision. The reason the emergency brake nevertheless did not engage is that the emergency braking system had been disabled: like Tesla's radar, LIDAR was also prone to "false detections" and would otherwise trigger pointless emergency stops.

When tuning a detection algorithm, for example one that "detects bicycles", setting the "bicycle" threshold too low means that objects that are not actually bicycles get detected, causing unnecessary braking. On the other hand, setting the threshold too high means missing bicycles that should be detected, increasing the risk of an accident.
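This threshold trade-off can be shown with a minimal sketch. The confidence scores and labels below are invented example data, not output from any real detector; the sketch only demonstrates how one knob moves errors between the two failure modes.

```python
# Toy illustration of the threshold trade-off described above: lowering the
# threshold causes false positives (needless braking); raising it causes
# missed detections (real bicycles overlooked). All data here is made up.

def evaluate(detections, threshold):
    """Return (false_positives, misses) for a given confidence threshold."""
    false_positives = sum(1 for score, is_bike in detections
                          if score >= threshold and not is_bike)
    misses = sum(1 for score, is_bike in detections
                 if score < threshold and is_bike)
    return false_positives, misses

# (detector confidence score, whether the object really is a bicycle)
detections = [
    (0.95, True), (0.80, True), (0.60, True), (0.40, True),      # real bicycles
    (0.70, False), (0.55, False), (0.30, False), (0.10, False),  # signs, shadows
]

for threshold in (0.2, 0.5, 0.9):
    fp, missed = evaluate(detections, threshold)
    print(f"threshold={threshold}: {fp} false positives, {missed} missed bicycles")
```

At a threshold of 0.2 nothing is missed but three non-bicycles trigger braking; at 0.9 there are no false positives but three real bicycles are overlooked. No single threshold eliminates both error types.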

As these systems steadily improve, detection capability should rise rapidly, but reaching "zero false positives" is considered extremely difficult.

Will fully autonomous driving remain out of reach, with the technology limited to driver assistance, or will some manufacturer find a way to overcome this problem? With Japan planning to have laws in place for introducing self-driving cars by 2020, developments in this field are worth watching closely.

in Software, Ride, Posted by logc_nt