The original article, dated May 10, 2018, is titled "Alexa and Siri Can Hear This Hidden Command --- You Can't". According to the reporter (Craig Smith), a group of students from the University of California, Berkeley, and Georgetown University demonstrated back in 2016 that they could hide commands, inaudible to human ears, in white noise played over loudspeakers or in YouTube video soundtracks. Even though humans couldn't hear them, smart devices (!) would interpret them as voice commands.
Of course it's still early days. But as vendors compete to make their devices more "user-friendly", let's hope they don't forget about security! One of the researchers, Tavish Vaidya of Georgetown, wrote one of the first papers on audio attacks, under the title "Cocaine Noodles". And the reason for such a disturbing title? It's because devices interpreted the phrase "Cocaine Noodles" as "O.K., Google." ... Perhaps your humble blogger is safe, because he has an Aussie accent?
Craig goes on to report that, since this original proof of concept, researchers in China and the United States have improved on it, demonstrating that they can compose snippets of sound that are undetectable to human ears, yet which Apple's Siri, Amazon's Alexa and Google's Assistant will interpret as commands.
To quote from the article:
This month, some of those Berkeley researchers published a research paper that went further, saying they could embed commands directly into recordings of music or spoken text. So while a human listener hears someone talking or an orchestra playing, Amazon's Echo speaker might hear an instruction to add something to your shopping list.
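For the technically curious, here is a minimal conceptual sketch (in Python with PyTorch) of how this style of "hidden command" attack works in principle: take a benign waveform, then optimise a tiny perturbation so that a speech recogniser is nudged toward an attacker-chosen command, while the perturbation stays small enough that a human still hears the original audio. This is emphatically not the researchers' actual code; the `toy_recognizer`, the command indices and the `EPSILON` budget below are all made-up placeholders standing in for a real speech-to-text model.

```python
# Illustrative sketch only: a toy gradient-based "hidden command" attack.
# A real attack targets a full speech-to-text system; here a random linear
# layer stands in for the recogniser so the example is self-contained.
import torch

torch.manual_seed(0)

SAMPLES = 16_000          # one second of 16 kHz audio
NUM_COMMANDS = 8          # pretend the recogniser picks one of 8 commands
TARGET_COMMAND = 3        # hypothetical index of "add X to my shopping list"
EPSILON = 0.002           # per-sample perturbation budget (kept tiny)

# Hypothetical stand-in for a real speech recogniser.
toy_recognizer = torch.nn.Linear(SAMPLES, NUM_COMMANDS)

# The benign audio a human would hear (random noise as a placeholder).
benign_audio = torch.rand(SAMPLES) * 2 - 1

# The adversarial perturbation we optimise.
delta = torch.zeros(SAMPLES, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=1e-3)
target = torch.tensor([TARGET_COMMAND])

for step in range(500):
    optimizer.zero_grad()
    # Keep the perturbation within a tight amplitude budget via tanh squashing.
    bounded_delta = EPSILON * torch.tanh(delta)
    logits = toy_recognizer((benign_audio + bounded_delta).unsqueeze(0))
    # Push the recogniser toward the attacker's chosen command.
    loss = torch.nn.functional.cross_entropy(logits, target)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    adversarial_audio = benign_audio + EPSILON * torch.tanh(delta)
    predicted = toy_recognizer(adversarial_audio.unsqueeze(0)).argmax().item()
    max_pert = (EPSILON * torch.tanh(delta)).abs().max().item()
    print(f"recogniser now hears command index {predicted}; "
          f"max perturbation {max_pert:.4f}")
```

The design point is the trade-off in the loop: the smaller the perturbation budget, the harder it is for a human to notice the tampering, but the more optimisation steps it takes to fool the recogniser.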
Perhaps, dear reader, you should think twice before enabling voice-activated payment systems on any of your user-friendly devices?