[Lab] Messing with Voice Assistants
jason.cobill at gmail.com
Mon Jun 11 12:01:03 EDT 2018
Hey everyone - here's an interesting hack for you to think about. More
and more people are using voice interfaces ("Hey Siri!") to accomplish
small tasks while their hands are busy - as many as 50% of Americans use
digital assistants (including Alexa and Google Assistant) regularly,
mostly while driving but also while cooking, exercising, and hanging out
with friends.
These digital assistants have broad access to your data and applications
so they can help you, and it turns out it's easy to make someone else's
voice assistant call a number, visit a website, share your history, or
add and delete contacts just by asking it to - potentially exposing your
private information to a third party.
So how do you get someone's voice assistant to give up information
without getting caught?
Here's a demo of a voice attack where researchers garbled the audio
enough that it sounds like meaningless noise to humans but is still
understood by Google's voice recognition algorithm. You could imagine
this being played in a busy bus terminal or a train station, where it
would just blend into the hubbub.
*Obscured Voice Commands*
It gets scarier - here's a demo of voice commands being played through
ultrasonic speakers - totally imperceptible to humans but audible to a
phone's microphone.
*Inaudible Voice Commands*
Even worse: You can embed these high-frequency phone commands into audio
media - so some malicious DJ (evil Taylor Swift!) could control your phone
through a high-frequency audio layer in the music track.
*"The Dolphin Attack"*
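The core trick behind these ultrasonic attacks is plain old amplitude
modulation: the voice command is shifted up onto an inaudible carrier,
and the nonlinearity in a phone microphone's hardware demodulates it
back into the audible band the assistant listens to. Here's a minimal
sketch of that modulation step - all the numbers (sample rate, carrier
frequency, and a sine tone standing in for real recorded speech) are
illustrative assumptions, not values from the actual demos:

```python
import numpy as np

fs = 192_000        # sample rate high enough to represent the carrier
f_carrier = 25_000  # carrier above the ~20 kHz limit of human hearing
duration = 1.0      # seconds
t = np.arange(int(fs * duration)) / fs

# Stand-in for a recorded voice command: a 400 Hz tone.
# A real attack would modulate actual speech audio here.
baseband = np.sin(2 * np.pi * 400 * t)

# Classic AM: scale the ultrasonic carrier by (1 + m * baseband),
# where m is the modulation depth.
m = 0.8
carrier = np.sin(2 * np.pi * f_carrier * t)
am_signal = (1 + m * baseband) * carrier
am_signal /= np.max(np.abs(am_signal))  # normalize to [-1, 1]
```

Played through a speaker that can reproduce 25 kHz, all of this signal's
energy sits above human hearing - which is exactly why you'd never know
it was there.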
Spooky weird stuff. :) How do you protect your data from ultrasonic
attacks you can't hear?
I think this would make a very engaging Maker Faire demo! Not sure about
the legality of hijacking people's phones for educational purposes though!