Researchers teach Google and Amazon smart speakers to spy on users

So that's, ya know, terrifying

Whitehats at Germany's Security Research Labs have successfully developed and deployed malicious apps for Google's and Amazon's smart speakers.

As originally reported by Ars Technica, Security Research Labs developed eight apps in total: four Alexa Skills and four Google Assistant Actions. The exact purpose and function of each app varied, but all eight managed to make it through each platform's approval process - which should concern pretty much anyone who owns a smart speaker.

SRLabs has published videos showing these exploits in action.

According to a blog post by Security Research Labs, "as the functionality of smart speakers grows so too does the attack surface for hackers to exploit them."

"SRLabs research found two possible hacking scenarios that apply to both Amazon Alexa and Google Home. The flaws allow a hacker to phish for sensitive information and eavesdrop on users. We created voice applications to demonstrate both hacks on both device platforms, turning the assistants into ‘Smart Spies’."

Of course, if you don't dabble with things like Alexa Skills or the Google Assistant's Actions directory, you are theoretically safe - since you would need to enable a third-party app before it can do anything with your smart speaker. But considering how important a role the third-party ecosystem plays in the wider functionality of these products, it's especially concerning to see the vulnerability of these voice assistant platforms exposed in this way. 

We've written before about the privacy concerns that smart speakers like Google Home and Amazon Echo raise, but this is probably one of the first public examples of researchers successfully exploiting the way these products work in a manner that's genuinely concerning. 

Ultimately, SRLabs say that "users need to be more aware of the potential of malicious voice apps that abuse their smart speakers. Using a new voice app should be approached with a similar level of caution as installing a new app on your smartphone."

They also say that Amazon and Google need to implement better protections for end users, specifically a more thorough review process for third-party Skills and Actions.

Google say that they "are putting additional mechanisms in place to prevent these issues from occurring in the future."

"All Actions on Google are required to follow our developer policies, and we prohibit and remove any Action that violates these policies. We have review processes to detect the type of behavior described in this report, and we removed the Actions that we found from these researchers."

PC World has reached out to Amazon for comment.

You can find more information on the SRLabs website.