It is possible to command Alexa, Google Assistant, and Siri with sounds that cannot be heard by humans



The number of people using speaker-type devices such as Amazon Echo and Google Home is growing, and voice recognition AI has become a familiar part of daily life. Meanwhile, over the past two years, research teams in the US and China have shown that it is possible to give instructions to smart speakers without their users noticing, and have been developing that technology. With it, an attacker can quietly activate the AI on a smartphone or smart speaker, place phone calls, and open particular websites. If exploited, it could also become possible to open the door of someone's house simply by playing certain music, or to make purchases online without the owner's permission.

Alexa and Siri Can Hear This Hidden Command. You Can't. - The New York Times
https://www.nytimes.com/2018/05/10/technology/alexa-siri-hidden-command-audio-attacks.html

A 2016 study by the University of California, Berkeley and Georgetown University showed that "hidden commands" could be sent to smart devices by playing white noise through loudspeakers and in YouTube videos. These hidden commands could open a website or switch the device to airplane mode.

It is possible to "fool" voice recognition AI by applying special processing to voice data - GIGAZINE


In May 2018, the researchers announced a further advance on that work. Nicholas Carlini, one of the researchers, told the New York Times that he had succeeded in embedding hidden commands not in white noise but in things like recordings of human speech or of an orchestra. In other words, simply by playing such a recording, it becomes possible to add items to an Amazon shopping list without the device's owner realizing it.

The technique exploits the gap between how humans and machines recognize speech. Normally, speech recognition AI recognizes commands by converting sound into characters and assembling them into words and phrases. A hidden command can therefore be embedded by slightly modifying the audio so that the AI transcribes the attacker's command while humans hear nothing unusual. Carlini believes that, although the technology was still at the research stage as of May 2018, it is only a matter of time before it is used to exploit someone.
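As a rough illustration of the idea described above, the following Python sketch shows how such an adversarial perturbation could in principle be crafted with gradient descent. It is a conceptual outline only, not Carlini's actual code: the speech-to-text model and the target-phrase loss function are hypothetical placeholders for a real differentiable recognizer.

import torch

def craft_hidden_command(waveform: torch.Tensor,
                         model,              # hypothetical differentiable speech-to-text model
                         target_loss_fn,     # low when the model transcribes the attacker's phrase
                         max_amplitude: float = 0.01,
                         steps: int = 1000,
                         lr: float = 1e-3) -> torch.Tensor:
    # Optimize a small perturbation "delta" so that waveform + delta is transcribed
    # as the target phrase, while delta stays quiet enough that listeners still
    # hear only the original recording.
    delta = torch.zeros_like(waveform, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        logits = model(waveform + delta)   # recognizer output for the perturbed audio
        loss = target_loss_fn(logits)      # e.g. a loss against "add milk to my shopping list"
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-max_amplitude, max_amplitude)  # keep the change inaudibly small
    return (waveform + delta).detach()

In practice the size of the allowed perturbation would more likely be governed by a psychoacoustic model of what humans can hear than by the simple amplitude clamp used in this sketch.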

There are also examples of companies trying to profit from voice recognition devices. In 2017, Burger King included the voice command "OK Google" in a TV commercial to make Google Assistant explain the Whopper burger. Google later changed Google Assistant's specifications and invalidated the command, but the episode shows that smart speakers could also be used covertly by companies pursuing profit.

Burger King's TV commercial triggers Google Assistant to explain the Whopper, and a furious Google immediately disables the trigger - GIGAZINE


Accidental cases have happened as well. In an episode of the animated series "South Park," the character Cartman uses voice commands to add vulgar items to a shopping list, and the smart speakers of viewers watching the episode actually reacted to them. Some viewers also reportedly found that their Amazon Echo had responded to a "set an alarm for 7 AM" line and set a string of alarms.


Subliminal messaging aimed at humans is prohibited in movies, television broadcasting, and the like, but it has not yet been legally settled what happens when a subliminal message is directed at a machine. Because establishing legislation takes time and technology develops faster than the law, there is concern that the technology will run unchecked until clear laws are enacted.

In 2017, researchers at Princeton University in the US and Zhejiang University in China announced a technique called "DolphinAttack" that hacks voice assistants using frequencies the human ear cannot detect. In this attack, the device is first switched to a muted state, so even if it replies, the user cannot notice.

Hacking voice assistants such as Siri by skillfully manipulating sounds that humans cannot hear: what is the "DolphinAttack"? - GIGAZINE
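The core signal-processing trick reported for this kind of attack is amplitude modulation: a normal voice command is modulated onto an ultrasonic carrier that humans cannot hear, and the nonlinearity of the target device's microphone demodulates it back into an audible-band command. The Python sketch below illustrates only that modulation step; the carrier frequency, sample rate, and file names are illustrative assumptions, not values taken from the paper.

import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000   # ultrasonic carrier above typical human hearing (~20 kHz); assumed value
OUTPUT_RATE = 96_000  # high sample rate so the ultrasonic carrier can be represented

def modulate_command(input_wav: str, output_wav: str) -> None:
    # Read a mono baseband voice command (e.g. a recording of "OK Google").
    rate, voice = wavfile.read(input_wav)
    voice = voice.astype(np.float64)
    peak = np.max(np.abs(voice))
    if peak > 0:
        voice /= peak  # normalize to [-1, 1]

    # Crude upsampling by repetition; assumes OUTPUT_RATE is a multiple of the input rate.
    voice = np.repeat(voice, OUTPUT_RATE // rate)

    t = np.arange(voice.size) / OUTPUT_RATE
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)

    # Classic AM: (1 + m(t)) * carrier. The result is inaudible to humans, but a
    # microphone's nonlinear response can recover m(t) in the audible band.
    am = (1.0 + 0.8 * voice) * carrier
    wavfile.write(output_wav, OUTPUT_RATE, (am / np.max(np.abs(am)) * 32767).astype(np.int16))

# Hypothetical file names, for illustration only.
modulate_command("ok_google.wav", "ultrasonic_command.wav")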


In addition, researchers at the University of Illinois at Urbana-Champaign succeeded in carrying out ultrasonic attacks from a distance of 25 feet (about 7.6 meters). Ultrasonic waves cannot pass through walls, but where there is a window it may be possible to operate a smart device from outside the building.

Smart devices have been embraced by a great many people, and it has been predicted that Amazon's voice-assistant speakers will generate 10 billion dollars (about 1 trillion yen) in sales by 2020. Amazon, Google, and Apple are all focusing on keeping their smart speakers secure, and Google says it is also taking countermeasures against commands that humans cannot perceive, like those described above. Amazon's and Google's devices respond to certain commands only after recognizing a specific user's voice. Apple, for its part, requires the iPhone or iPad to be unlocked before Siri will act on commands that access personal data or open apps, which likewise serves as a mechanism to protect users from unexpected commands.

The technique of "attacking voice recognition AI using commands not recognizable to humans" is relatively new, and we do not yet know what kind of deployment will be in the future. Also, the correspondence of each company developing the device also depends on the type and contents of the terminal. Carlini at the University of California, Berkeley said, "We want to show that such technology is possible, and someone says," That's possible, let's fix it. " I am expecting it. "

in Software, Security, Posted by darkhorse_log