Research Finds Alexa, Siri and Google Assistant Susceptible to Inaudible Commands

The New York Times reports that researchers in China and the United States have discovered a way to surreptitiously activate and command virtual assistants such as Alexa, Siri and Google Assistant by broadcasting instructions that are inaudible to the human ear. Researchers from UC Berkeley have said that they can even alter audio files "to cancel out the sound that the speech recognition system was supposed to hear and replace it with a sound that would be transcribed differently by machines while being almost undetectable to the human ear". While at present this is strictly an academic exercise, the Berkeley researchers say it would be foolish to assume that hackers won't discover the same methods.
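To make the idea concrete, here is a minimal sketch of that kind of audio manipulation. It is not the Berkeley attack itself, which targets the Mozilla DeepSpeech recognizer through its transcription loss; the stand-in recognizer, labels and numbers below are purely hypothetical, and the point is only to show how a barely perceptible perturbation can be optimized until a model "hears" a different command.

```python
# Toy illustration of adversarial audio (NOT the Berkeley attack, which targets
# DeepSpeech via a CTC loss). A small perturbation is optimized so that a
# stand-in recognizer changes its prediction while the waveform stays nearly
# unchanged. All names and numbers here are hypothetical.
import torch

torch.manual_seed(0)

# Stand-in "speech recognizer": a linear classifier over one second of
# 16 kHz audio, mapping the waveform to 10 made-up command labels.
recognizer = torch.nn.Linear(16000, 10)
for p in recognizer.parameters():
    p.requires_grad_(False)

waveform = torch.randn(1, 16000) * 0.1            # placeholder for a benign recording
original = recognizer(waveform).argmax().item()   # what the model hears today
target_label = torch.tensor([(original + 1) % 10])  # the command the attacker wants heard

delta = torch.zeros_like(waveform, requires_grad=True)   # the hidden perturbation
optimizer = torch.optim.Adam([delta], lr=1e-3)
epsilon = 0.01                                    # cap the change so it stays tiny

for _ in range(500):
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(recognizer(waveform + delta), target_label)
    loss.backward()
    optimizer.step()
    with torch.no_grad():                         # keep the perturbation within the cap
        delta.clamp_(-epsilon, epsilon)

print("original prediction:   ", original)
print("adversarial prediction:", recognizer(waveform + delta).argmax().item())
print("max sample change:     ", float(delta.abs().max()))
```

The published attack optimizes against a full speech-to-text model rather than a toy classifier, but the structure is the same: minimize a loss that rewards the target transcription while constraining how much the audio is allowed to change.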

Nicholas Carlini, a fifth-year PhD student at UC Berkeley and one of the co-authors of the paper, said that the team just wanted to see if they could make the previously demonstrated exploit even more stealthy.

This theory was put into practice last year, when Chinese researchers built a device from off-the-shelf parts to send such inaudible commands to virtual assistants.

The researchers in China call the technique "DolphinAttack".
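At a high level, DolphinAttack amplitude-modulates a voice command onto an ultrasonic carrier above the range of human hearing and relies on the nonlinearity of the target device's microphone to demodulate the command back into the audible band, where the speech recognizer picks it up. The sketch below shows that principle in simulation; the carrier frequency, sample rate and crude quadratic microphone model are illustrative assumptions, not the hardware parameters from the Chinese team's paper.

```python
# Simulated sketch of the DolphinAttack principle: hide a "voice command" on an
# ultrasonic carrier and let a nonlinear microphone demodulate it. The numbers
# below (carrier, sample rate, nonlinearity, filter) are illustrative only.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 192_000                                  # high sample rate to represent ultrasound
t = np.arange(0, 1.0, 1 / fs)

command = np.sin(2 * np.pi * 400 * t)         # stand-in for a recorded voice command
carrier = np.cos(2 * np.pi * 25_000 * t)      # 25 kHz: inaudible to nearly all humans
transmitted = (1 + 0.8 * command) * carrier   # amplitude modulation onto the carrier

# Crude model of microphone nonlinearity: the squared term demodulates the
# envelope (the original command) back down to baseband.
mic_signal = transmitted + 0.1 * transmitted ** 2

# Low-pass filter at 8 kHz, roughly what a voice front end keeps.
b, a = butter(4, 8_000 / (fs / 2))
recovered = filtfilt(b, a, mic_signal)

# Although everything transmitted sat above 20 kHz, the recovered baseband
# signal tracks the original command closely.
print("correlation with original command:",
      round(float(np.corrcoef(command, recovered)[0, 1]), 3))
```

As the article notes, the Chinese team built its transmitter from off-the-shelf parts, which is what moves this from a theoretical curiosity toward a practical concern.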

Amazon says it has taken steps to make sure Alexa is secure, though it hasn't said what those steps are. Google said its platform has features that mitigate such commands. Apple says the HomePod is programmed not to perform certain tasks, such as unlocking a door, and insists that Siri on the iPhone and iPad is safe because the device has to be unlocked before it will execute such commands. With nearly all virtual assistants gaining more features, it's time to address the inherent security loopholes they open up.

There is no U.S. law against broadcasting subliminal messages to humans, let alone machines.

"The song carrying the command could spread through radio, TV or even any media player installed in portable devices like smartphones, potentially impacting millions of users in long distance", the researchers wrote. The receiver must be close to the device, but a more powerful ultrasonic transmitter can help increase the effective range.

The commands aren't discernible to humans, but will be gobbled up by the Echo or Home speakers, the research suggests. The researchers were able to embed commands directly into recordings of music or spoken text.

They also embedded other commands into music clips.

"Companies have to ensure user-friendliness of their devices, because that's their major selling point", said Tavish Vaidya, a researcher at Georgetown. He wrote one of the first papers on audio attacks, which he titled "Cocaine Noodles" because devices interpreted the phrase "cocaine noodles" as "OK, Google".

But Carlini explained that the team's goal is to flag the security problem, and then to try to fix it.
