Threat to privacy: Amazon Echo, Google Home can eavesdrop on conversations

By Xite - October 22, 2019
A team of security researchers have found that third-party apps or skills on Amazon Alexa and Google Home can eavesdrop on users and compromise their privacy. Both the companies claims that they have ....

Amazon, Google, Apple, and Microsoft have two things in common. First, all these tech giants have a virtual assistant and second, all of them have been mired into controversies that are related to these AI-powered assistants. All of them have discussed their versions of the story, and have taken necessary steps to prioritise their users’ privacy. However, in the latest development, it has been found that third-party or custom apps of Amazon Echo and Google Home can eavesdrop on users’ conversations.

The (latest) threat to users’ privacy was disclosed by Luise Frerichs and Fabian Bräunlein, security researchers at Security Research Labs (SRLabs), who shared their findings with ZDNet. According to the duo, both ‘phishing and eavesdropping vectors are exploitable via the backend that Amazon and Google provide to developers of Alexa or Google Home custom apps.’ They also say that Apple’s Homepod is safe in this regard.

‘These backends provide access to functions that developers can use to customize the commands to which a smart assistant responds, and the way the assistant replies,’ ZDNet notes. According to the SRLabs team, by adding the ‘�. ‘ (U+D801, dot, space) character sequence to various locations inside the backend of a normal Alexa/Google Home app, hackers (which may be posing as developers) could generate long periods of silence during which the mic remains active, that is, the device can listen to the conversations.

The most systematic way that an app or skills should work is to just wait for a command and the mic should turn off after certain seconds if there’s no communication from the user. With a hack introduced, the mic can listen for a longer duration. This essentially means that the device is listening to users' conversation when they think it isn’t. You can see how this works in the videos below.

SRLabs team claims that they had reported the issue to both Amazon and Google earlier this year, but they haven’t fixed the issue till now. ‘Finding and banning unexpected behavior such as long pauses should be relatively straight-forward. We are surprised that this hasn't happened since reporting the vulnerabilities several months ago,’ the team was quoted as saying. Meanwhile, Google and Amazon have responded to the development saying that corrective actions have already been taken.

‘All Actions on Google are required to follow our developer policies, and we prohibit and remove any Action that violates these policies. We have review processes to detect the type of behavior described in this report, and we removed the Actions that we found from these researchers. We are putting additional mechanisms in place to prevent these issues from occurring in the future,’ Google was quoted as saying.

Google also clarified that Google Home will never ask for the account password. Similarly, Amazon said that its devices would never ask for a user's password, and that they have ‘put mitigations in place to prevent and detect this type of skill behavior and reject or take them down when identified.’

  • Tags
  • Google Assistant
  • Amazon Alexa
  • Echo
  • Google Home