Is Your Home Spying on You? How Smart Speakers and Homes Erode Privacy
Apple has finally entered the smart speaker game with the HomePod, and, with Siri having paved the way, virtual assistants like Alexa and Google Assistant are gaining popularity. Smart speakers are a huge convenience, especially if you have compatible smart home devices, but they come with a significant downside for privacy. I’ve been writing about tech for well over a decade, but when it comes to smart speakers I’m more of a skeptic than an early adopter.
I’ve long had concerns that as we add more smart devices to our everyday lives, we are ceding our privacy rights. Our data is already on servers around the world in the form of web searches, browsing activity, navigation, messaging, and more. Now our voices are part of the data mix, and the current generation of toddlers and young children won’t remember a world without voice assistants.
Smart speakers are always listening, but are only supposed to record audio when prompted. In late 2017, though, a reporter testing out the Google Home Mini discovered it was recording him 24/7. Google rolled out a software update to fix this bug, and eventually disabled a “top touch” button that was at the root of the problem. But what’s to stop something like this from happening again?
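To make the “always listening but not always recording” distinction concrete, here is a toy Python sketch of a wake-word gate. Everything in it is made up for illustration (the wake phrase, the buffer size, the text-chunk format); real devices do this with on-device audio models, not strings, and this is not Google’s or Amazon’s actual code. The point is that audio heard before the wake word sits only in a small local buffer and is continually discarded, and a bug like the Mini’s amounts to this gate being bypassed.

```python
from collections import deque

# Illustrative sketch only -- not any vendor's actual firmware.
# The microphone is always "hearing", but pre-wake audio is held in a
# small rolling buffer that is continuously overwritten. Nothing leaves
# the device until the wake word is detected; only then does recording
# (and upload) begin, stopping when the request ends.

WAKE_WORD = "hey speaker"   # hypothetical wake phrase
BUFFER_SECONDS = 3          # only a few seconds ever held pre-wake

def handle_audio_stream(chunks):
    """Simulate an always-listening loop over one-second audio 'chunks'
    (represented here as transcribed strings for simplicity)."""
    rolling = deque(maxlen=BUFFER_SECONDS)  # old audio falls off the end
    recording = []
    awake = False

    for chunk in chunks:
        if not awake:
            rolling.append(chunk)            # overwritten locally, never uploaded
            if WAKE_WORD in chunk.lower():
                awake = True                 # start capturing the request
        else:
            if chunk == "<silence>":         # request finished
                print("Uploading request:", recording)
                recording.clear()
                awake = False
            else:
                recording.append(chunk)

    # Anything still in `rolling` was never sent anywhere.
    print("Discarded pre-wake audio:", list(rolling))

if __name__ == "__main__":
    handle_audio_stream([
        "private dinner conversation",
        "more private conversation",
        "hey speaker",
        "what's the weather tomorrow",
        "<silence>",
    ])
```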
Another concern I have is law enforcement’s use of smart speaker data in criminal cases. Detectives in Arkansas, investigating a murder, executed warrants against Amazon in an attempt to gain access to the suspect’s Echo, which they believed contained audio evidence (a witness had reported that music was playing from the Echo). Amazon refused to release the data, but eventually the suspect himself allowed access to his Echo. Several months later, prosecutors dismissed the case without revealing whether they had gotten any evidence from the device.
Reading about this case, I couldn’t help but wonder what would have happened if the suspect’s unit had been defective and recording nonstop. Audio alone might not be enough to exonerate or implicate him, but how would it play in front of a jury? What if prosecutors were able to play audio that made him look bad, but not necessarily guilty?
I got a hint at this type of future when I watched, with ever-growing dread, the Black Mirror episode “Crocodile.” In the techno-dystopian series’ fourth season, privacy is front and center in a world where government-sanctioned insurance investigators capture memories related to insurance claims using a non-invasive device, and are privy to anything else that comes to mind during the session. The episode might as well be titled “Slippery Slope.” I’m not too worried about a memory-capture machine in the near term, but I do worry about voice data being treated the same way.
That’s not to say you should toss your smart speaker out the window. They’re still useful in a variety of ways, from reciting the day’s weather forecast as you get ready to turning up the thermostat without getting out of bed. But you should maintain a healthy skepticism.
Pay attention to software updates and do a regular housecleaning of your data (both voice and typed). With the Amazon Echo, you can clear your history using the mobile app or the Amazon website, while Google offers a site for viewing and managing your activity (myactivity.google.com/myactivity). However, it’s not clear whether, or for how long, your data remains on either company’s servers. Consider trying out the Mycroft Mark II, an open-source alternative to the Amazon Echo that doesn’t store any voice data by default; users have to opt in.
Other good practices include:
- Limit the accounts and services you connect to a smart speaker.
- Use two-factor authentication to prevent others from logging in.
- Train the speaker to respond only to your voice (or those of family members).
- Use a secure wireless network with a strong, unique password (see the sketch after this list).
- Turn off voice activation when you need the utmost privacy.
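On the wireless network point, the weakest link is often the Wi-Fi password itself. If you want a quick way to generate a strong one, here is a short Python sketch using the standard library’s secrets module; the length and character set are just reasonable defaults I chose for illustration, not a requirement of any particular router.

```python
import secrets
import string

# Illustrative only: one simple way to generate a strong Wi-Fi passphrase.
# A long random string is far harder to guess than a reused or dictionary password.
ALPHABET = string.ascii_letters + string.digits

def make_passphrase(length: int = 24) -> str:
    """Return a random passphrase built with a cryptographically secure generator."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

if __name__ == "__main__":
    # Paste the result into your router's Wi-Fi settings and a password manager.
    print(make_passphrase())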