An attack method called "MadRadar" has been developed that tricks the radar sensors of self-driving cars, either making a "phantom oncoming vehicle" appear or rendering a vehicle that really exists invisible, potentially causing an accident.

A group of researchers led by Miroslav Pajic and Tingjun Chen, associate professors of electrical and computer engineering at Duke University's Pratt School of Engineering, has developed MadRadar, an attack that tricks the radar sensors in self-driving cars. The group warns that techniques like MadRadar for hijacking self-driving cars, long a staple of Hollywood movie plots, could become a real-world threat.

Engineers Develop Hack to Make Automotive Radar Hallucinate | Duke Pratt School of Engineering

MadRadar hack can make self-driving cars 'hallucinate' imaginary vehicles and veer dangerously off course | Live Science

MadRadar, developed by Pajic and his colleagues, attacks self-driving cars by tricking their radar sensors: it can hide the fact that another car is approaching, or make a "phantom car" appear where none exists. The research team has successfully demonstrated a proof of concept of MadRadar.

MadRadar Fools Automotive Sensors into Dangerous Situations - YouTube

A team of engineers at Duke University has developed an attack method called "MadRadar" that deceives the radar sensors installed in cars offering functions such as adaptive cruise control.

Modern cars with autonomous driving systems, as well as those with adaptive cruise control, are equipped with radar sensors to detect the environment in front of and around the vehicle. These radar sensors may also be used to supplement the vehicle's camera systems.

However, because many vehicles on a highway operate radar at the same time, even vehicles of the same make and model do not use exactly the same operating parameters: to avoid interfering with one another, their radars transmit at slightly different frequencies or take measurements at slightly different intervals. MadRadar therefore needs to learn the specific parameters used by the target vehicle. As Pajic put it, "To jam or hijack a signal, you first need to understand which signal the other party is using."
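To illustrate the idea of learning a victim radar's timing, here is a minimal sketch, not the researchers' implementation: it assumes an attacker can timestamp the start of each received chirp and estimates the chirp period from those timestamps. The function name and the example numbers are hypothetical.

```python
from statistics import median

def estimate_chirp_period(chirp_start_times):
    """Estimate the victim radar's chirp period (seconds) from a short
    burst of observed chirp start times."""
    intervals = [b - a for a, b in zip(chirp_start_times, chirp_start_times[1:])]
    return median(intervals)  # the median is robust to one missed or spurious chirp

# Hypothetical capture: a radar chirping every 50 microseconds, observed
# for ~250 microseconds, far less than the quarter-second budget the
# researchers report for MadRadar.
observed = [1.23e-3 + 50e-6 * n for n in range(6)]
period = estimate_chirp_period(observed)
```

Once the period (and similarly the carrier frequency) is known, the attacker can time its own transmissions to land inside the victim's measurement windows.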

MadRadar, developed by the Duke University researchers, can determine these parameters accurately in less than a quarter of a second. It can then transmit a crafted signal at the target vehicle's radar sensor, corrupting its perception, for example by making a car that is actually moving away appear to be approaching.
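Why a crafted signal can plant a phantom follows from how automotive (FMCW) radar measures range and speed: range comes from the echo's round-trip delay, and closing speed from its Doppler shift. The sketch below, a simplified model under those standard assumptions rather than the MadRadar signal chain itself, computes the delay and Doppler offset an attacker would impose on a replayed chirp; the function names and the 77 GHz carrier (the common automotive band) are assumptions for illustration.

```python
C = 3.0e8  # speed of light, m/s

def phantom_delay(fake_range_m):
    """Extra delay (s) to add to a replayed chirp so the victim radar,
    using the round-trip model, infers a target at fake_range_m."""
    return 2.0 * fake_range_m / C

def apparent_range(extra_delay_s):
    """Range (m) the victim radar infers from a given echo delay."""
    return C * extra_delay_s / 2.0

def doppler_shift(closing_speed_mps, carrier_hz=77e9):
    """Doppler shift (Hz) that makes the phantom appear to approach
    at closing_speed_mps (f_d = 2 * v * f_c / c)."""
    return 2.0 * closing_speed_mps * carrier_hz / C

tau = phantom_delay(50.0)  # phantom car 50 m ahead
fd = doppler_shift(10.0)   # ...appearing to close at 10 m/s
```

Flipping the sign of the Doppler offset is what turns a receding car into an apparently approaching one, which matches the misdetection described above.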

MadRadar can disrupt the target's radar sensor even while both the target car and the surrounding cars are moving, and the sensor keeps misdetecting surrounding vehicles until the attack ends.

Because MadRadar attacks can be carried out without prior knowledge of the target vehicle's radar parameters, the research team argues that commercial automotive radar systems need rigorous security measures.

"Even without knowing much about the target car's radar system, our real-world experiments can make phantom cars appear out of nowhere or make cars that are really there go undetected," said Pajic. "We are not developing MadRadar to hurt anyone, but to expose the problems in existing radar systems and demonstrate that their design needs to change fundamentally."

He added, "Imagine adaptive cruise control, which relies on radar sensors, believing that the car in front is speeding up when its speed has not actually changed, and accelerating in response. If this happened at night, the collision could occur before the vehicle's cameras recognized the car ahead."

The details of MadRadar will be presented at the Network and Distributed System Security (NDSS) Symposium, which begins on February 26, 2024.

in Software,   Ride,   Video,   Security, Posted by logu_ii