Hey Google, Are Virtual Assistants Betraying Consumer Trust?

By Byron Muhlberg

The more humanlike a virtual assistant appears to us to be, the more consumer trust is placed in the device, a new study reveals, raising fresh concerns around Big Tech and data protection.

According to the forthcoming study from the University of Waterloo, when developers give virtual assistants anthropomorphized ('humanlike') characteristics, people tend to be more willing to share their personal information with the devices. These characteristics include the tone and quality of the virtual assistant's voice, as well as how its physical and emotional qualities are perceived.

The multi-phase study — undertaken by a joint team of researchers from Waterloo’s mathematics and computer science faculties — is scheduled for presentation in April 2020. It promises not only to shed light on the extent to which people trust their virtual assistants, but also to provide fodder for privacy advocates who believe that brands could misuse the trust people place in digital assistants.

Many such products have burst onto the scene in recent years, with conversational voice assistants like Siri and Alexa growing rapidly and the industry as a whole enjoying a high degree of consumer trust.


Although virtual assistants take on a variety of shapes and forms, the typical product combines smart speakers or other smart home devices with artificial intelligence in order to mimic authentic human interaction through conversation.

Reimagining the virtual assistant

The researchers invited 10 men and 10 women to interact with the three most popular virtual assistants available today: Amazon Alexa, Google Assistant, and Apple Siri. The subjects were then asked to report how they perceived each virtual agent, as a measure of consumer trust. Finally, they were asked to describe what each digital assistant would look like physically if it were a human being.

By combining the subjects' responses, the researchers were able to map out a physical profile for each virtual assistant and to ascertain the level of consumer trust people have when interacting with them.

Alexa, for example, was thought of as authentic and kindhearted, while Siri was regarded as crafty and dishonest by comparison. The subjects also believed that Google Assistant and Siri possessed more markedly individualistic personalities than Alexa, which was seen as somewhat more neutral and generic.

When they were asked to physically describe each virtual assistant, the subjects marked Alexa as being shorter in height than the others. Google Assistant was seen as being of average height while Siri was regarded as being the tallest of the three. Alexa was also thought of as being the oldest of the three virtual assistants, while Siri was imagined as being the youngest.

The subjects also rated the virtual assistants along other descriptive dimensions, including hair color, attire, and whether or not each assistant was imagined as wearing glasses.

Where convenience ends, consumer trust begins

In recent years, the virtual assistant industry has witnessed an explosive period of growth at the expense of consumer trust. Big tech firms, ever eager to expand the capabilities of their virtual assistant technologies, are continuously exploring new avenues through which they can integrate their products with sectors as diverse as banking, healthcare, and education.

In September 2019, for example, United Arab Emirates (UAE)-based bank Emirates NBD announced the launch of 'voice banking' in partnership with Amazon Alexa. According to the official press release, the technology allows Emirates NBD customers to have their bank balance and recent transactions read aloud to them by Alexa. United Kingdom (UK)-based bank National Westminster Bank ('NatWest') is pursuing a similar endeavor using Google Home, a feature it began trialing in August 2019.

By making life increasingly convenient, developments such as these could represent unparalleled changes to the way ordinary people go about their day-to-day lives. However, in spite of the clear benefits on offer, as virtual assistant technologies gobble up new types of data in ever larger amounts, the central question becomes less one of convenience than one of consumer trust.

Certainly, in the wake of prolific breaches of personal data such as the Facebook-Cambridge Analytica scandal of 2018, consumer trust in Big Tech has been in decline. This is backed by a 2019 Microsoft report, which suggests that 41% of conversational assistant users were apprehensive about "trust, privacy and passive listening" relating to their device.


It remains difficult to estimate how consumer trust in digital assistants will fare in the coming years. For the time being, however, as natural language processing technology grows more capable, users will likely find themselves walking a tightrope between trust concerns and simple convenience.
