New smart home devices like the Amazon Echo and Google Home are raising numerous legal and data privacy issues, primarily because these Internet of Things (IoT) devices record the conversations you have in your daily life. If you wouldn’t want a friend recording one of your conversations, would you want a digital device doing the same?
The Amazon Echo case
The one legal case that has people talking in 2017 involves the Amazon Echo, a voice-controlled digital device that sits passively in your home until it’s activated with your voice. This past holiday season, Internet of Things devices were among the most wanted gifts, and the Amazon Echo was one of the surprise consumer hits of the season.
And now the Amazon Echo is at the center of a murder investigation in the United States, in which sounds recorded at home during an interaction with the device might provide important clues to a bizarre case. The suspect invited some friends over to watch a football game at night, and in the morning there was a dead body in the backyard.
Prosecutors want access to the Amazon Echo data, confident that it can help shed some light on the murder case. The thinking is that the Echo’s sensors might have been activated out of passive listening mode and, if so, might have recorded the sounds of a fight or an argument.
The defendant, of course, is trying to block access to the device on privacy grounds. From this perspective, pulling data from a smart home device like the Amazon Echo is akin to pulling data from a smartphone or tablet, and that data is a lot harder to obtain than you might think. Amazon, for its part, has refused to hand over any data in the investigation. From Amazon’s perspective, this is no different from an illegal wiretap and an invasion of its customers’ privacy.
And there’s one more wrinkle to this Amazon Echo case. The murder suspect also had a “smart” water meter hooked up to the house, so there was a second smart home object in play. Investigators now think they can use data from this water meter to determine whether there was a spike in water usage in the middle of the night of the murder, which might indicate that the suspect used water to wash blood or other evidence out of the house and backyard. As might be expected, the suspect is trying to block access to that data, too, on privacy grounds.
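To make the investigators’ idea concrete, here is a minimal sketch, in Python, of how a spike in overnight water usage might be flagged in hourly smart meter readings. Everything in it is an assumption for illustration only: the numbers, the hourly layout, and the threshold are invented and do not reflect the actual meter, the case, or any real analysis.

```python
# A minimal, hypothetical sketch of flagging an overnight spike in smart water
# meter readings. The numbers, the 2.0 threshold, and the hourly layout are all
# made up for illustration; they do not come from the actual case or device.
from statistics import mean, stdev

# Hourly usage in gallons, keyed by hour of day (0-23) -- invented data with a
# suspicious burst of usage at 2-3 a.m.
hourly_usage = {
    0: 2, 1: 1, 2: 140, 3: 95, 4: 3, 5: 2, 6: 10, 7: 25,
    8: 18, 9: 12, 10: 15, 11: 20, 12: 22, 13: 17, 14: 16, 15: 14,
    16: 19, 17: 30, 18: 28, 19: 24, 20: 21, 21: 15, 22: 8, 23: 4,
}

def flag_spikes(usage, z_threshold):
    """Return the hours whose usage sits far above the day's average (a simple z-score test)."""
    values = list(usage.values())
    avg, sd = mean(values), stdev(values)
    return [hour for hour, gallons in usage.items()
            if sd > 0 and (gallons - avg) / sd > z_threshold]

print(flag_spikes(hourly_usage, z_threshold=2.0))  # prints [2, 3] for this invented data
```

Even a crude test like this hints at how much a single household sensor can reveal about when something out of the ordinary happened.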
New privacy cases involving the Internet of Things
You can think of this strange example as a test case for the Internet of Things and data privacy. We want our devices to be “smart” and “connected” – but we also don’t want them turning into “tattletales” that know too much about us. If you’re a teenager, for example, you don’t want your smart devices telling your parents that you were trying to access certain restricted websites. If you’re married, you don’t want your connected devices inadvertently telling your significant other about any potential infidelity. If you’re an office employee, you don’t want any smart devices telling your co-workers or bosses what you really think of them.
If anything, though, these types of test cases for data privacy rights will only become more frequent as more everyday objects are built with networking and connectivity and hooked up to the Internet. Whether we like it or not, we leave behind a data trail every time we interact with these devices. Our smart refrigerators know if we’ve been cheating on our diets, and our smart fitness bands know if we’ve been following our workout schedules. That data is then uploaded to the cloud, where it is vulnerable to hackers.
Data privacy and smart “Internet of Things” toys
Sometimes, objects may be listening to us and watching us without our giving it a second thought. And companies are getting ever more creative, dreaming up new applications of the technology and collecting more data to feed into their analysis systems. “Smart toys,” for example, represent another potential hazard. One toy that has people talking is the My Friend Cayla doll, an artificial intelligence (AI)-powered toy capable of holding basic conversations with your kids.
There’s just one problem here – the toy needs to be connected to the Internet in order to answer your child’s questions. The difficulty is that hackers could theoretically break into the toy and use it to monitor your child or even carry on conversations. No password is required to access the doll, so it’s completely vulnerable. Think of it as the Internet of Things turned into an evil Internet of Toys.
That concern is so real that, in Germany, there has been a push to get rid of the toy entirely. German telecom regulators have even labeled the doll an “illegal espionage apparatus” because it could be used to illegally spy on children. Retailers could face fines if they continue to sell it.
Artificial intelligence changes everything
For now, the intelligent devices hooked up to the Internet of Things are merely “smart.” They are not sentient or capable of emotion, even the AI-powered ones that are capable of learning. Thus, when you converse with an AI-powered chatbot on your iPhone or Android phone, you are fully aware that you are not talking to a thinking machine with real feelings.
The same thing is true for Google Home, which has been programmed to give some funny responses to certain questions. On a first pass, you might think the device has been imbued with a sense of humor, but it is nothing more than a fancy computer science parlor trick.
Those are mere devices, though; things are not as clear when robots are involved. A fascinating November 2016 article in The New Yorker (“If Animals Have Rights, Should Robots?”) outlined the case for giving certain rights to robots. For example, if you have a smart, humanoid robot – like Pepper the Robot, popular in Japan – what happens if you physically abuse it? If the robot is capable of experiencing emotions and subjective experiences, shouldn’t it receive the same rights we give to our fellow humans?
Data privacy: Man vs. Machine
Where things really become interesting is when we start to think about the data privacy rights of these robots. Nobody would argue that his or her smartphone has “privacy rights.” But what about a smart robot hooked up to the Internet of Things? Going back to the Amazon Echo example, imagine the same scenario, but this time involving a smart home robot. In this case, the robot could be forced to “testify” against its owner during a criminal investigation.
For now, of course, many of these issues are simply interesting moral or philosophical thought experiments. The Internet of Things, however, is starting to blur the line between thought experiment and reality, especially when it comes to data privacy – and at what cost? So many smart technologies are now collecting so much information about our lives that we are being forced to reconsider many of our most basic assumptions about privacy.