
Do Americans Care How Much of Their Data is Collected from Smart Speakers?

Short answer: some do, some don’t. Americans are split on whether they care that their smart home assistants collect their data. On the whole, though, it’s clear most Americans aren’t comfortable with their private conversations being recorded without permission.

AI-powered home assistants have received significant backlash over such privacy concerns. For instance, Google and Amazon came under fire for allegations that their employees and contractors eavesdrop on conversations to improve smart assistant technology. These claims have been disputed, denied, and in some cases confirmed, leaving users more confused than ever. What is clear is that no smart home assistant user is off the hook.

In truth, everything people say in their homes (or within earshot of a smart assistant speaker) is recorded and stored for future use — even conversations that don’t begin with “Okay, Google” or “Hey, Alexa.”

Smart home speaker makers may want the audio recordings so they can further improve their technology, but the practice toes a fine line between analyzing data for the sake of improvement and eavesdropping.

Why do Americans feel split on the issue?

In many ways, smart home assistants make our lives infinitely more convenient. And when paired with other smart home technology, they can control lights, lock doors, and even adjust thermostats and water heaters — all through a voice command.

But as smart home assistants become more integrated with other smart home technology, many Americans feel too connected. People are increasingly wary about their information being in the hands of the government and private corporations.

While Americans agree that AI-powered technology can be helpful, they wish they had more control over what information smart home assistants collect.

Do the risks outweigh the benefits?

Like most instances of hacking, your audio recordings can be used to find out more about you. If you order a pizza and read out your credit card number over the phone, for example, a hacker who captures that recording could use your card details to uncover more sensitive information about you (e.g., Social Security number, bank account information, etc.).

Unlike cameras, which typically show a light to indicate that they are recording, smart home assistants offer little indication that they are listening. Most will stop listening when muted or unplugged, but people often forget that otherwise they are actively listening.

Creators of smart home assistants are looking into offering a way to indicate when a smart home speaker is active, similar to a camera’s red light.

In general, the more protected you keep your information, the less susceptible you’ll be to hackers. The problem is, people let their guard down at home.

The risk data collection poses is one of violation. Americans feel defensive and violated at the thought of private conversations being used for research purposes.

What’s the data used for?

Companies collect private data to fix bugs, upgrade model versions, and further improve their technology. The government collects data for public safety reasons. Americans are more accepting of data collection if the data is used for “positive” things that will improve the overall health and safety of a population.

Here’s a list of ways data from smart speakers can be used:

  • Public safety
  • Health or fitness tracking
  • Mental health/disease tracking
  • Terrorist attack prevention
  • Crime-solving

In general, Americans are roughly split on whether their personal data should be used to aid criminal investigations.

49% of Americans think it’s unacceptable for companies to share their data with law enforcement, but the same number think it’s permissible for the government to collect data when used to prevent terrorist attacks.

How can people know if they’re at risk?

They don’t. There’s no way smart speaker users can know whether their audio recordings are being analyzed by companies. What we do know is that a hacker infiltrating your data is less likely than your device’s parent company listening in.

25% of adults own a smart speaker. But just because people buy smart speakers doesn’t mean they aren’t concerned about eavesdropping. Of those owners, over half are worried about the amount of data their devices collect on a daily basis.

Forms of hacking, separate from data collection, can be a bit more obvious. For example, there’s even speculation that hackers could use lasers to control people’s smart speakers, since the microphones in these devices can respond to light as well as sound.

Of course, signs of fraud, identity theft, and hacking on other platforms may indicate a breach in security, but it’s hard for users to tie their smart speaker to the source.

How can you stay safe?

Depending on the safeguards they put in place for smart home technology, some people are more at risk than others. For instance, consumers who customize their smart device’s settings, mute their device, and make a habit of deleting stored data are better off than those who don’t.

The Amazon Echo is the top-selling smart speaker on the market. Luckily, it offers plenty of options to control how much data your device records and stores. In the Alexa app, for example, users can turn off the setting “Manage How Your Data Improves Alexa.” In doing so, the device will no longer store audio for learning purposes. You can also mute an Echo or Google Home by pressing the microphone on/off button on the device.

With these privacy settings in place, your device won’t listen or respond to voice commands while muted, and you’ll reduce the number of audio files it automatically stores.

Smart speaker users should also delete their device’s stored audio files regularly, ideally clearing each day’s recordings at the end of the day. Your device won’t be able to study the deleted files and learn from them, and by minimizing the amount of stored data, you decrease your risk of having your conversations analyzed.