Not only are they in your bedroom, home office, car and smartphone, but they’re in our watches, phones, earbuds, and even our microwaves and refrigerators. Voice assistants, which allow us to gain information just by using our voice, are inescapable.
That doesn’t make them evil, but it means its absolutely vital to know any and all associated risks. You know not to give out your email password or conduct top-secret work on public WiFi, but do you know exactly which kinds of data the voice assistants around you are collecting? Do you know how to delete it?
We’ve rounded up a practical guide to the privacy and security concerns you should take into consideration pretty much everywhere you go—even if you don’t own a device with a voice assistant (though…you probably do).
If it’s impossible to get away from devices that record audio, it’s important to understand what could go wrong, in theory, to make the best choices possible to protect yourself.
This month, German security consulting firm Security Research Labs discovered a serious flaw in voice assistant security: voice data on your Google Home and Alexa devices can be hacked into through third-party apps, or skills.
Researchers at the company created eight dummy apps for the platforms to illustrate possible hacking scenarios. Most of these were horoscope skills that seemed fairly innocuous. However, developers sometimes include up to a full minute of silence after users think the app has stopped running. That means anything said near the speaker at that time will be recorded without the user’s knowledge and sent to the developer.
In other even more malicious cases, a skill may tell a user that an update is ready and that Alexa or Google Assistant needs to hear the user’s password to install it. These are phishing attempts to get user passwords and not legitimate asks from Amazon or Google.
“Users need to be more aware of the potential of malicious voice apps that abuse their smart speakers,” the authors of the study wrote. “Using a new voice app should be approached with a similar level of caution as installing a new app on your smartphone.”
Charles Henderson, Global Partner and Head of IBM X-Force Red, told Popular Mechanics that while the early adopters of voice assistants faced the most tumultuous waters in terms of privacy, that we still aren’t off the hook.
Henderson runs a team of hackers at IBM, more or less. Their job is to think like criminals and break things so that they can help clients understand how to build infrastructure that will keep them safe from adversaries that may want to compromise a system or install malware on a device, like a voice assistant, for example.
Luckily, he said, he hasn’t seen malware invade dedicated voice assistant devices like the Amazon Echo or the Google Home, but malware infects phones all the time—and they do have voice capturing capabilities and embedded voice assistants.
“You have to think about…it’s really tough to get away from voice assistants today,” Henderson said. “They’re sort of omnipresent.”
It’s not worth worrying about constantly or you’ll drive yourself insane, he said, but consumers should be making informed decisions about how they use and trust their voice assistants.
Context is key, Henderson said.
Imagine in one scenario that you have a voice assistant in your home’s living room and a five-year-old child is asking it a bunch of questions. Henderson’s own son did this with questions about how fast The Flash from DC Comics actually is in “real life,” he said with a laugh.
Those questions aren’t as sensitive as, say, merger conversations that a worker may be having from a home office. In that case, it’s not a great idea to have the voice assistant set up in that room.
“Context is important in privacy and security and consumers can serve themselves well by understanding their own context,” Henderson said.
Companies like Google and Amazon are trying to implement privacy controls, but no one is perfect.
Typically, voice assistants only send your voice data to servers once it hears the wake word you’ve set. That’s typically “Alexa” or “Hey, Google.” The device must always listen, so it can detect the wake word. But those conversations are not sent to company servers, Henderson explained.
Still, the market has demanded more privacy options, which is why there are now slide covers for cameras on voice assistants with a camera, or a mute function on many Google Home smart speakers.
“That’s a big improvement to the market…you’re seeing the voice assistant companies becoming cognizant to privacy,” Henderson said. “They’re starting to see consumers question [the devices]. And as a hacker, I’ve got to tell you that questions are good.”
Voice companies are also upgrading their software to make it easier for you to understand the data you’ve forked over and how to delete it, if desired.
In September, Google debuted a new privacy setting for its namesake Google Assistant, allowing users to delete data with a simple voice command. Alexa has this option, too, though not by default.
If you want to delete any information your home assistant has gathered over the months/years, here’s how to do it:
Amazon Echo: Go to your Alexa app and then find Settings > Alexa Privacy or visit this Amazon site to review your voice transcripts, listen to the voice recordings Alexa has sent to Amazon’s servers, and delete whatever you’d like—file by file—or all at once.
For reference, here is what my part of my transcript looks like on a laptop:
You can also set up voice deletion, which is not a default setting. It allows you to ask Alexa to delete what you just said, or ask her to delete everything you said that day.
Amazon is careful to note that deleting your voice files may make Alexa less responsive to your demands.
Google Home: Google is sort of its own animal, given that its voice assistant is built into many Android smartphones by default. To ensure audio recording is turned off on your phone or tablet, open Settings > Google Account > tap Data & Personalization at the top > Under Activity Controls, select Web & App Activity and then check or uncheck the box next to “include voice and audio recordings” to turn the setting off or on.
Remember that based on other settings that control your phone’s microphone, audio recordings may be taken and stored in other places in your phone, but will not be sent to Google’s servers.
To review your voice transcripts and recordings from Google Assistant, navigate to your Google Account and make sure you’re logged in. Then, on the left navigation panel, click Data & Personalization > Activity controls > Web & App Activity Manage Activity. From there, you can take a look at your past activity and any files with an audio icon will include a recording.
To listen to those recordings, click Details (next to the audio icon) > Show Recording to play. If for some reason you get an error message that reads “transcript not available,” that means either your microphone was turned off or there was too much background noise.
Plus, Google is soon rolling out voice controls to delete data just by asking the Assistant, similar to Amazon’s Alexa.
Apple’s Siri: You’ve got two options when it comes to Apple’s voice assistant, which is embedded in its various iPhone models. If you’re worried about voice recordings, skip to option two.
1) Clear all of Siri’s search history, but without deactivating search entirely:
Quit Safari on your iPhone if you currently have it open then go to Settings > Scroll down and select Safari > click Clear History.
2) Clearing the Siri search history, plus deactivating Siri entirely:
Go to Settings > General > scroll down to Siri > tap on Siri to deactivate it > return to Settings > Select General again > tap on Keyboard > disable the Dictation option.
Disabling Siri will clear your voice search history as well.
If you’re still a bit squeamish about them, that’s perfectly fine.
For those split down the middle on whether or not to buy one and introduce it to your home, try easing into it by putting it in a room where the least important conversations happen, Henderson said.
However, “if you are sitting up at night worrying about if your voice assistant is listening to you, maybe voice assistants aren’t for you,” he said. You still should follow best practices for privacy, though, because other people around you will inevitably have at least one voice assistant in their home or on their person.
How do you get around that? Henderson says he’s taken on a specific tactic.
“I adopt what I call the ‘coffee shop policy’ [that I use] for every phone call,” he said. “I pretend I’m in a coffee shop and don’t say anything I wouldn’t want people in the coffee shop to hear.”
But its best to become fluent in privacy tools around voice assistants, because they’re coming whether you like it or not.