Research Finds Alexa, Siri and Google Assistant Susceptible to Inaudible Commands

The New York Times reports that researchers in China and the United States have discovered a way to surreptitiously activate and command these virtual assistants by broadcasting instructions that are inaudible to the human ear. Researchers from UC Berkeley have said they can even alter audio files "to cancel out the sound that the speech recognition system was supposed to hear and replace it with a sound that would be transcribed differently by machines while being almost undetectable to the human ear". While at present this is strictly an academic exercise, the university's researchers say it would be foolish to assume hackers won't discover the same methods as well.
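At a high level, the Berkeley approach optimizes a small perturbation to a waveform until the recognizer transcribes an attacker-chosen phrase, while keeping the perturbation quiet enough to go unnoticed. The sketch below is a minimal, self-contained illustration of that loop, not the published attack: the real work targets a full speech-to-text network, whereas here a tiny made-up linear "recognizer" stands in so the script runs on its own, and every parameter value is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 16000                                # one second of 16 kHz audio
audio = rng.standard_normal(n) * 0.1     # stand-in waveform
W = rng.standard_normal((2, n)) * 0.01   # made-up two-class "recognizer"
target = 1                               # the transcription we want forced

delta = np.zeros(n)                      # perturbation to optimize
lr, eps = 0.5, 0.005                     # step size, max per-sample distortion

for _ in range(200):
    logits = W @ (audio + delta)
    p = np.exp(logits - logits.max())
    p /= p.sum()
    # gradient of the cross-entropy loss toward `target`, taken w.r.t. the input
    grad = W.T @ (p - np.eye(2)[target])
    delta -= lr * grad
    delta = np.clip(delta, -eps, eps)    # keep the change nearly inaudible

snr_db = 10 * np.log10(np.sum(audio**2) / np.sum(delta**2))
print("forced class:", int(np.argmax(W @ (audio + delta))),
      f"| perturbation sits {snr_db:.1f} dB below the signal")
```

The same structure, swapped onto a real speech model and its transcription loss, is what lets a command hide inside an ordinary-sounding recording.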

Nicholas Carlini, a fifth-year PhD student at UC Berkeley and a co-author of the paper, said the team simply wanted to see whether they could make the previously demonstrated exploit even more stealthy.

This theory was put into practice last year, when Chinese researchers created a device from off-the-shelf parts to send such inaudible commands to virtual assistants.

Researchers in China call the technique "DolphinAttack".
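The published DolphinAttack amplitude-modulates a voice command onto an ultrasonic carrier above roughly 20 kHz; nonlinearity in the target device's microphone then demodulates the command back into the audible band, where the assistant hears it. Below is a hypothetical sketch of that signal preparation in Python, with a pure tone standing in for a recorded command and all parameter values chosen for illustration rather than taken from the paper.

```python
import numpy as np

fs = 192_000                 # sample rate high enough to represent ultrasound
f_carrier = 25_000           # carrier above the ~20 kHz limit of human hearing
t = np.arange(fs) / fs       # one second of time samples

# Stand-in for a recorded voice command: a 400 Hz tone. A real attack
# would use an actual speech waveform normalized to [-1, 1].
command = np.sin(2 * np.pi * 400 * t)

# Standard double-sideband AM: carrier * (1 + m * command)
m = 0.8                                   # modulation depth
carrier = np.sin(2 * np.pi * f_carrier * t)
transmitted = carrier * (1 + m * command)

# A microphone with a quadratic nonlinearity y = x + a*x^2 produces a
# baseband term proportional to the command, which the assistant hears.
a = 0.1
received = transmitted + a * transmitted**2

spectrum = np.abs(np.fft.rfft(received))
freqs = np.fft.rfftfreq(received.size, d=1 / fs)
audible = (freqs > 20) & (freqs < 20_000)
peak = freqs[audible][np.argmax(spectrum[audible])]
print(f"strongest audible component after demodulation: {peak:.0f} Hz")
```

Everything the attacker broadcasts sits above 20 kHz, yet the microphone's own hardware hands the assistant a clean audible copy of the command.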

Amazon says it has taken steps (though it hasn't said what they are) to make sure Alexa is secure. Google said its platform has features that mitigate such commands. Apple says the HomePod is programmed not to perform certain tasks, such as unlocking a door, and insists Siri on the iPhone and iPad is safe because the device has to be unlocked before it will execute such commands. With nearly all virtual assistants gaining features, it's time we addressed the inherent security loopholes they open up.

There is no U.S. law against broadcasting subliminal messages to humans, let alone machines.

"The song carrying the command could spread through radio, TV or even any media player installed in portable devices like smartphones, potentially impacting millions of users in long distance", the researchers wrote. The receiver must be close to the device, but a more powerful ultrasonic transmitter can help increase the effective range.

The commands aren't discernible to humans, but will be gobbled up by Echo or Home speakers, the research suggests. The researchers were able to embed commands directly into recordings of music or spoken text.

They also embedded other commands into music clips.

"Companies have to ensure user-friendliness of their devices, because that's their major selling point", said Tavish Vaidya, a researcher at Georgetown. He wrote one of the first papers on audio attacks, which he titled "Cocaine Noodles" because devices interpreted the phrase "cocaine noodles" as "OK, Google".

But Carlini explained that their goal is to flag the security problem, and then to try to fix it.
