Are Hackers Using Popular Assistant Devices To Listen To Users?
Posted on January 7, 2020 by Rick Hulse
Unfortunately, they’re prone to abuse and exploits by hackers and unsavory developers. They can be used to spy on and even steal sensitive information from unsuspecting users.
This is not new in and of itself. Security researchers around the world have, at various points over the last couple of years, sounded the alarm about weaknesses and exploits. To the credit of both companies, any time this has happened, both Amazon and Google have responded promptly, plugging gaps and shoring up the security of their devices.
The utility of virtual assistants like Amazon’s Alexa and Google Home are undeniable. They’re just genuinely handy devices to have around.
Unfortunately, every few months or so, new exploits are discovered. The two companies are essentially playing Whack-A-Mole with security flaws, which appear to have no end.
Recently, security experts published two videos, one for Alexa and one for Google Home. Each demonstrated a simple back-end exploit that anyone with a DevKit could employ. The exploits revolve around inserting a question character (U+D801, dot, space) to various locations in the code. Then they introduce a long pause during which the assistant remains active and listening.
To give you an idea of how this could be exploited, one of the example videos shows a horoscope app triggering an error, but the presence of the special character introduces a long pause during which the app is still active.
During the long pause, the app asks the user for their Amazon/Google password while faking a convincing looking update message from Amazon or Google itself. Given the long pause, few users associate the poisoned horoscope app with the password request. It seems like it’s coming from the device itself.
It’s both sneaky and troublesome, and worst of all, even when both companies move to address this issue. By this time next month if history is a guide, there will be others. We’re not saying not to use them, but when you do, be very mindful.