Google Home smart speaker in home environment

Your Privacy May Feel Safer at Home – But Your Smart Speaker Is Listening, and It’s Creepy

Alexa and Siri continue to be the most popular folks around, especially as most Americans continue spending more time in their homes due to the COVID-19 pandemic. After sales of smart speakers skyrocketed by 70% last year, Americans are relying more and more on these handy devices for entertainment and news while stuck inside. But new research, conducted by the Ponemon Institute, reveals a darker picture: Sixty-nine percent of survey respondents say they are very concerned about protecting their data privacy when using these smart devices. And if recent information on data security shows anything, it’s that American consumers are right to worry.

Our smart devices are powered by artificial intelligence that the parent companies are always working to strengthen and improve. That might sound good, but the reality is far uglier. When asked the one location they trust most when shopping, banking and performing other financial activities online, 46% of consumers name their home. However, these convenient devices in our homes can falsely activate up to 19 times per day. Employees of these companies listen to and review the audio snippets the devices collect. Oftentimes, this information is benign enough – “Alexa, what is the temperature?” “Siri, where’s the nearest gas station?” – but other times, it can get a little too personal. Apple contractors who analyzed recordings to improve the company’s artificial intelligence reported hearing couples having sex or conversations about confidential medical information. Amazon workers said they heard what sounded like a sexual assault. And with 52% of voice assistant users saying they’re using their devices nearly every day during the coronavirus outbreak, smart speakers have an even greater chance to listen in than ever before.

And it’s not just audible content that these devices track. The Google Nest thermostat has a motion sensor inside that monitors whether or not someone is moving around the house. That means the device knows not only if you change the temperature, but also if you wake up in the middle of the night to use the bathroom or let the dog in. Smart lights store information about when you turn out the lights for the night and transmit that data to the parent company.

For many Americans, the result of all this data collection is downright creepy. Sixty-six percent of consumers report they have received online ads that were relevant to them yet not directly based on their online search behavior or publicly available information. And a solid 64% of consumers say they find it “creepy” when that happens. How could it not be, when companies can use our personal data to anticipate our wants and needs before we’ve even formulated them? That’s a degree of online exposure that just doesn’t sit right with our ideas of personal privacy.

Given these realities, you’d think we’d all be rushing to pitch the devices out the window. But the opposite is the case: Marriott has brought Alexa-powered Echo Dots into its hotel rooms, which I had the chance to experience during a recent stay. Marriott insists that it won’t be able to listen to the content of guests’ conversations with Alexa. But even if that is true, it’s hardly an encouraging sign to see a company with such a poor track record on data privacy (Marriott’s Starwood reservation database breach exposed the records of hundreds of millions of guests) partnering with a company whose record is even worse.

For-profit corporations aren’t the only ones incorporating smart speakers into their work. Hospitals are doing it too. Cedars-Sinai Medical Center in Los Angeles is putting them in patients’ rooms to help with basic tasks like changing the TV channel or calling a nurse for help getting to the bathroom. Boston Children’s Hospital is using them to help staff keep track of information like available beds. It has also developed an Alexa skill called KidsMD that lets parents ask Alexa what to do when their child is sick. The goal is to minimize emergency room visits that aren’t strictly necessary, allowing doctors to focus on patients who need the specialized care hospitals can provide.

No matter how worthy our goals, we cannot allow ourselves to slip into a world where every patient, every hotel guest and every customer finds themselves sharing their personal data in public spaces without their consent. I urge individuals who have these devices in their homes to research the privacy protections their devices’ current software offers and change their settings accordingly. As of May 2019, neither Amazon nor Apple allowed users to stop their smart speakers from recording altogether, though users could go back and delete earlier recordings. Google Assistant, however, does allow it, and you should take advantage of that option as soon as you can.

Similarly, companies considering integrating this technology into their operations should wait until better privacy protections are available. The convenience these devices offer is not worth the privacy we surrender to them.

Above all, however, I urge our lawmakers to intervene. We are not free if we cannot control when companies collect our data, and the numbers show that 60% of consumers think government oversight is required to protect privacy. America exists because we believe the real power belongs to the people. Big tech companies have done their best to take our control over our privacy from us. It’s time we took it back.