Alexa, Siri, Google Assistant and the Implications of an Always-On Assistant

The Internet of Things (IoT) and the growing ecosystem of smart devices are designed to make life easier. Instead of going to your computer or phone to look something up, you can simply call out your question and Alexa, Siri, or Google Assistant will answer you.

But have you ever really thought about how these home assistants know when you’re talking to them? To do so, they have to be listening all the time, meaning that the devices, the servers that process the audio, and the companies that build and operate them are listening 24/7.

This can be a major privacy issue, but it is manageable if the companies take proper steps to protect your sensitive information. At a minimum, they should anonymize the collected recordings, secure them properly, and destroy them as soon as the request is complete. Ideally, requests would also be handled entirely by machines, with no humans in the loop.
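To make that standard concrete, here is a minimal sketch in Python of what such handling could look like: the account identifier is replaced with an unlinkable hash, the request is answered by software alone, and the record is destroyed as soon as the answer is produced. Every name here (VoiceRequest, anonymize, answer_query) is hypothetical; this is an illustration of the principle, not any vendor’s actual pipeline.

    import hashlib
    import secrets
    from dataclasses import dataclass

    # Hypothetical names and structures throughout; not any vendor's real API.

    @dataclass
    class VoiceRequest:
        account_id: str   # identifies the customer
        transcript: str   # what the assistant heard
        audio: bytes      # the raw recording

    def anonymize(request: VoiceRequest) -> VoiceRequest:
        """Replace the account identifier with a salted one-way hash.

        A fresh random salt per request means the hash cannot be reversed,
        or even linked to the same account across requests.
        """
        salt = secrets.token_bytes(16)
        hashed = hashlib.sha256(salt + request.account_id.encode()).hexdigest()
        return VoiceRequest(account_id=hashed,
                            transcript=request.transcript,
                            audio=request.audio)

    def handle(request: VoiceRequest, answer_query) -> str:
        """Answer a voice request by machine only, then destroy the record."""
        anon = anonymize(request)
        answer = answer_query(anon.transcript)  # no human ever sees the transcript
        del anon, request                       # destroy the record once the request is answered
        return answer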

But what if they don’t? The number of data breaches in recent years has demonstrated that companies have a very hard time protecting sensitive data. And recent news reports show that these smart home companies have no intention of meeting even these minimal standards.

Amazon Is Listening

Amazon’s Alexa frequently features in headlines about the security and privacy implications of the smart home. Amazon has already revealed that human operators have access to the recordings made by Alexa. Since Alexa is “always-on”, this is like having an Amazon employee eavesdropping on every conversation you have within hearing distance of the device. In fact, US employees regularly listen to as many as a thousand Alexa audio clips a day as part of the effort to improve Alexa. Nor is access to the recordings limited to Amazon employees; the company regularly uses contractors to review recordings as well.

The fact that Amazon allows human operators to hear your Alexa requests is bad enough, but Amazon’s record deletion policy makes things even worse. Amazon has recently stated that voice records from Alexa are kept indefinitely if the user does not manually delete them. Even then, Amazon reserves the right to retain transcripts of the interactions, meaning that Amazon can always know what you’ve searched for with Alexa.

And access to the voice recordings retained by Amazon may not even be limited to Amazon employees. Amazon’s policy for sharing data with third parties is unclear, so data may be intentionally shared outside the company. That doesn’t even account for glitches in Amazon’s systems, through which Alexa voice recordings may be (and have been) unintentionally shared with other Alexa customers. With a smart home assistant, you not only have someone eavesdropping on you whenever you’re nearby, you also have no idea who they’re sharing that information with.

And It’s Not Just Them

Amazon’s Alexa may get the most headlines for data privacy concerns, but it’s certainly not alone. While Amazon is the most blatant about how it uses and stores your data, all voice assistants are “always on”, and every organization developing one likely runs an improvement program in which voice recordings from the devices are reviewed by humans to identify and correct issues with the system.

In fact, Google has confirmed that recordings made by its Google Home devices are available to contractors to help improve the Google Assistant’s artificial intelligence algorithms. This admission came after more than a thousand Dutch-language voice recordings were publicly leaked. Over a hundred of these clips had been captured accidentally, meaning that the owners did not know they were being recorded and certainly did not know the recordings would be subjected to human review to improve the Assistant software.

Google reports that it makes an effort to protect owners’ privacy by anonymizing the data. However, its anonymization efforts have been insufficient. The leaked audio recordings included the full address of one user and personal information (children’s names, significant others, etc.) of several others. This sensitive data, on top of anything that can be extracted from the voiceprint itself, may be enough to de-anonymize the collected and leaked data.
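As a simple illustration of why this kind of anonymization falls short, consider a leaked record whose account identifier has been removed but whose transcript still names an address and a family member. The data below is invented for the example.

    # Invented example: the account ID has been stripped, yet the transcript itself
    # still reveals an address and a name, so the "anonymous" record can be
    # re-identified by anyone who reads (or hears) it.
    leaked_clip = {
        "account_id": None,  # removed during "anonymization"
        "transcript": "Navigate home to 12 Example Lane and remind me to call Sarah's school",
    }

    print(leaked_clip["transcript"])  # all of the identifying detail survives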

Privacy Implications of the Smart Home

Smart home technology is extremely convenient, but that convenience often comes at the cost of privacy. These devices are designed to listen to their owners constantly so that they can respond to the wake phrases (“Alexa”, “OK Google”, etc.) indicating that the owner has a task for them.
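As a rough sketch of that design, assuming a simple on-device wake-word detector (microphone_frames, detect_wake_phrase, and send_to_cloud are hypothetical stand-ins, not any vendor’s firmware), the device keeps a short rolling buffer of audio at all times and only ships audio off the device once a wake phrase is heard:

    from collections import deque

    WAKE_PHRASES = ("alexa", "ok google", "hey siri")
    BUFFER_FRAMES = 50   # keep only the last few seconds of audio on the device

    def run_assistant(microphone_frames, detect_wake_phrase, send_to_cloud):
        buffer = deque(maxlen=BUFFER_FRAMES)   # rolling window; older audio falls out
        for frame in microphone_frames:        # the microphone never stops producing frames
            buffer.append(frame)
            if detect_wake_phrase(buffer, WAKE_PHRASES):
                # Only after the local detector fires does any audio leave the
                # device; what is sent here may be stored, transcribed, and
                # reviewed by the vendor.
                send_to_cloud(list(buffer))
                buffer.clear()

A false trigger in that local detector is exactly how the accidentally captured recordings described above end up on a vendor’s servers.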

Owners of these devices have to trust the manufacturer not to record or listen to any conversation that isn’t intentionally directed at the device, and the breached dataset of Google voice recordings shows that this trust is misplaced. Add to that the fact that smart home manufacturers regularly have humans listen to these recordings, and that Amazon admits it has no intention of implementing a default retention deadline (and may even retain transcripts of recordings the user has deleted).

Smart assistants represent a clear tradeoff between convenience and user privacy and security. The massive datasets collected by these devices are treasure troves of personal data for hackers. Access to a user’s recordings would likely give a hacker everything needed to commit credit card fraud, launch a spear-phishing campaign, or mount any number of other attacks. Home assistants also threaten businesses’ cybersecurity, since recordings capture employees discussing business matters at home. It’s vital to consider the security implications whenever you’re talking within earshot of one of these devices.
