Someone May Be Listening to You Through Your Smart Speaker
The best way to protect your privacy from smart speakers is not to own one, because when you speak to the device, a stranger might be listening in.
When you speak to your smart speaker, a stranger may be listening as well. According to a Bloomberg report, Amazon employs thousands of workers dedicated to listening to the audio files captured by its smart speakers and using those recordings to improve the Alexa software.
Should we really be surprised?
Like I tell my friends and family, if you care about your privacy, you shouldn't have a smart speaker in your home. Your spoken interactions are not just between you and the device; strangers could be listening to and analyzing a recording of your conversations with your smart speaker. What makes things worse is that some recordings aren't even dialogue between you and your device. Sometimes the recordings capture conversations between spouses, a television playing in the background, or someone singing in the shower; these devices have even recorded apparent crimes, and those recordings have been used as evidence.
How can this happen? Amazon claims it only records and stores audio after a "wake phrase" is spoken to the device, but it's not that simple. These devices are always listening for their wake phrase so they can turn on and start responding to queries from the user. Wake-phrase detection isn't perfect, though: the device might think you said "Hey Alexa" when you were actually speaking to your friend Alex. Sometimes the device will even wake up for no apparent reason and begin recording. Whatever the Echo records is then sent back to Amazon, where actual people pore over hundreds of thousands of audio files to help improve Alexa's functionality.
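To see why false triggers are inevitable, note that wake-phrase detection boils down to comparing a similarity score against a threshold. Here's a minimal, illustrative Python sketch; it is not Amazon's actual detector, and the threshold value and the use of simple text similarity in place of a real acoustic model are assumptions for demonstration:

```python
from difflib import SequenceMatcher

WAKE_PHRASE = "hey alexa"
THRESHOLD = 0.8  # hypothetical sensitivity, chosen for illustration

def wake_score(heard: str) -> float:
    """Stand-in for an acoustic model: score how closely what was
    heard (here, a transcript) matches the wake phrase."""
    return SequenceMatcher(None, WAKE_PHRASE, heard.lower()).ratio()

def should_record(heard: str) -> bool:
    """The device starts recording (and uploading) whenever the
    similarity score clears the threshold."""
    return wake_score(heard) >= THRESHOLD

print(should_record("Hey Alexa"))      # True: the intended trigger
print(should_record("Hey Alex"))       # True: a false trigger (~0.94 similarity)
print(should_record("pass the salt"))  # False: clearly no match
```

Set the threshold too high and the device ignores real commands; set it low enough to feel responsive and it will inevitably record things that were never meant for it.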
A dedicated team listens to thousands of recordings a day, annotates conversations, notes other sounds picked up by the speaker, and records any information Amazon deems relevant. And it's not just one person listening to each recording: particularly amusing or entertaining audio files are often shared among Amazon workers. Are you comfortable with strangers across the country potentially hearing everything you say in your own home?
Amazon isn't unique. Any company offering a virtual assistant runs the same kind of improvement process, because AI and machine-learning models need constant fine-tuning by actual humans. Any interaction with Siri goes to Apple, and any interaction with Cortana goes to Microsoft. All voice recognition software is imperfect, so all voice recognition software will pick up audio it wasn't meant to hear. That means even your iPhone might start listening to you unprompted, so be careful talking about anything controversial around your smart speaker. Thankfully, these companies allow users to opt out of having their audio data used to improve products, but Amazon has stated that users' audio might still be manually analyzed even if they've opted out.
So much for the power of the consumer.
Adding fuel to the fire, Amazon doesn't completely anonymize the audio it receives: each audio file includes an account number, a device serial number, and the user's first name. The Bloomberg story reported a worker listening to a conversation containing banking information. It wouldn't take much for a disgruntled employee to look up the associated purchase information and gain access to that user's bank account. Other sensitive and personal recordings were reportedly heard as well: intimate conversations, arguments, even children playing, all of which should stay in the home.
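To make that concrete, here is a hedged sketch of what the metadata attached to each clip might look like, based only on the fields Bloomberg reported; the structure, field names, and values are invented for illustration:

```python
# Hypothetical shape of a reviewed clip's metadata. The three identifying
# fields come from the Bloomberg report; everything else is assumed.
clip = {
    "account_number": "A1B2C3D4E5",       # ties the clip to a specific account
    "device_serial": "G090LF0964640QXR",  # ties it to a specific Echo in a specific home
    "first_name": "Dana",                 # the user's real first name
    "audio_file": "clip_000123.wav",
}

# Data like this is pseudonymized at best, not anonymized: anyone with
# access to Amazon's account records could join account_number back to
# a full identity, purchase history, and home address.
```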
What Are the Privacy Implications?
First off, someone could always be listening to any interaction within earshot of a smart speaker or virtual assistant. Are you comfortable knowing that anything you say might be heard by someone you've never met? Every individual has a right to privacy, and not just from the government.
Secondly, where and how is this information stored? Amazon has not said how long it stores audio data, but my guess would be years. Many of these audio files contain very sensitive or personal information. The article claims recordings with sensitive information get a special "critical info" tag, but are they stored separately from all the other audio files? The potential impact of a data breach is incalculable; if these audio files ever made it onto the open web, hackers would have a field day.
Lastly, how do we know Amazon will not take further action on recordings that potentially contain crimes, or Amazon's version of "harmful content"? The government has asked Amazon to hand over audio files before, and it's not a stretch to think Amazon will continue to comply in the future. What happens when an overzealous worker thinks they are hearing a crime and decides to report it to the police? Everyone has a right against self-incrimination; shouldn't that apply to every individual's devices as well?
The solution is simple: don't purchase a smart speaker.
If you insist on having one, though, each device should have an option under its privacy settings to opt out of allowing Amazon, Google, etc. to use your audio data for product improvement. As for your phone, turn off your virtual assistant and audit your apps to make sure none have unnecessary access to your microphone. Still, any smart speaker connected to the internet can receive commands from Amazon, so there is no sure way of knowing the microphone is actually off. Keeping these devices out of your house is the only complete solution. Otherwise, some stranger may hear whatever you say to or around your smart speaker, and it may not just stay with them.