This holiday season, shoppers around the world may be considering what kinds of gadgets to buy for family and friends.
From wearable health monitors, to smart speakers and in-home security systems, consumer electronics that collect and share data are increasingly common and affordable.
The Thomson Reuters Foundation asked Alexis Hancock, a digital rights expert at the Electronic Frontier Foundation, what privacy risks consumers should keep in mind when considering these kinds of technologies.
What should consumers keep in mind when buying gadgets that collect personal data?
I would suggest thinking about the exact need you are trying to fulfill: how exactly is this going to make your life better? Sometimes it doesn’t require an internet-connected device.
Not every tool needs to come with an app; you can get cool stuff without a privacy risk.
But once you connect things to the internet, you need to worry about all sorts of things: secure passwords, encryption, and of course the privacy practices of the companies and what they do with the data they’re collecting.
We know that our data is being collected all the time; shouldn’t we just let it go?
So, we try to combat a lot of this kind of nihilism at the Electronic Frontier Foundation. People may think: “They’ve already collected so much data, I should just accept it.”
What we say is that pushing back can be good and consumers can have an impact.
Last year, for example, after consumer outcry over Google’s Nest home security system, which had failed to disclose a built-in microphone, the company changed it, and now there’s a physical kill switch on the mic.
Nest is one of many home security camera systems for sale – what, if any privacy risks should people consider?
Again, I would suggest people think: what problem are you trying to solve? Do you really need a company-affiliated camera and doorbell device in your home?
A lot of these products sell themselves as being about safety and security, but many of the business practices we’ve seen come to light are concerning, especially the amount of data sharing with police and other entities.
So people should ask themselves: are you buying something that’s going to keep you safe, or are you buying another layer of surveillance for your loved ones?
How does that thinking apply to in-home smart speakers and assistant devices?
If you are worried about companies sharing or capturing your data, I would research what privacy measures are built into the devices – do they allow you to cut off the mic, can you disconnect from the internet?
If they don’t function without the internet, is your home internet safe? Remember: the more devices you add in your home, the more routes there are for your data to be shared.
Also, something to consider for voice-activated devices: the artificial intelligence isn’t always built for everyone, and it might not recognize certain kinds of accents or voice types.
And what about wearable devices like watches that monitor your steps or heartbeat?
There are valid reasons to buy these kinds of tools. I enjoy products that help me understand myself better, but I don’t enjoy sending that data off to endpoints on the internet that I don’t know about.
And if these devices use AI to make recommendations about my sleep, or tone of voice, they should be transparent about the models they are using – I want to know how they are arriving at these recommendations.