
2021-09-10
Annabelle Theobald

How secure is convenient?

In his SOUPS paper, CISPA researcher Alexander Ponticello sheds light on new authentication mechanisms for voice assistants.

"Alexa, transfer 20 euros to Sabrina" - "Okay, now tell me your PIN" - this is what many a conversation in living rooms sounds like in the US. There, it is already possible to conduct banking transactions via Cortana, Siri, Alexa, and others. It is probably only a matter of time before these services are also offered in Europe. Users also have to authenticate themselves to their digital assistants for other security-critical services, such as unlocking and locking the front door by voice command. Until now, this has usually been done by saying a PIN out loud. But is that secure enough? To ensure that the security needs of users are taken into account when designing such authentication mechanisms, Alexander Ponticello is researching which risks users fear, how they perceive interaction with their voice assistants, and what makes them feel secure. The South Tirol native, who conducts research under the direction of CISPA faculty Dr. Katharina Krombholz, presented his paper, "Exploring Authentication for Security-Sensitive Tasks on Smart Home Voice Assistants," at the Seventeenth Symposium on Usable Privacy and Security (SOUPS 2021).

"Voice assistants are now in many homes and bedrooms, they are built into smartphones and cars, and their fields of application are rapidly expanding. As the services expand further into sensitive areas of life such as health or finance, the demands on their security and trustworthiness also increase enormously," explains Alexander Ponticello, who researches usable security at CISPA. But what exactly do people need to trust systems, and what price are they willing to pay for this security? To find out, the 28-year-old conducted a qualitative study in which he asked voice assistant owners about their usage behavior and perception of security. 

"We gained some exciting insights in the process." Most of the 16 study participants did not trust the loudly spoken PIN as an authentication method. Instead, they wanted identification via biometric voice recognition - or a combination of both security functions. When authenticating via a PIN, users are primarily concerned that others could listen in. This concern included not only strangers or criminals but also partners, friends, and children, depending on the type of interaction. Almost all study participants stated that they basically trusted their friends. However, many did not feel comfortable with the idea that they knew the authentication PIN for bank transfers. Some users would also like a more discreet authentication mechanism in the presence of children. They fear that, depending on their age, they could use the PIN to load money onto their cell phones, for example. Trust in partners was greater. Users were particularly concerned about the idea that knowledge of the PIN could be abused after a breakup.

It also makes a difference to users in which rooms the assistants are located. The bedroom is generally regarded as a private room where the level of protection during authentication can be lower, because no eavesdroppers are expected there; in the living room, by contrast, the protection must be much stronger in the opinion of many study participants. "There were also indications of differences depending on the living situation. For example, statements suggested that city dwellers often feel that someone can always listen in, while rural dwellers do not worry about this." From the users' perspective, the devices and their security mechanisms should therefore be adaptable to their individual situation. A "discreet mode" or the ability of devices to recognize where and in what context they are being used could in the future allow users to access sensitive services without attracting much attention. "However, that brings new privacy challenges," Ponticello says. Almost all participants already worried that the companies behind the voice assistants could pass on the data shared with them.
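What such a context-dependent policy could look like is sketched below. Everything in it - the room names, the Protection levels, the required_protection function - is invented for this article to illustrate the idea, and is neither part of the study nor of any real assistant platform:

```python
# Illustrative sketch: the required protection level depends on where
# (and in whose presence) the assistant is used.
from enum import IntEnum

class Protection(IntEnum):
    NONE = 0      # no authentication required
    PIN = 1       # a spoken PIN is acceptable
    DISCREET = 2  # require a silent factor, e.g. confirmation on the phone

# Per-room defaults reflecting the study's observations: bedrooms were
# seen as private, living rooms as potentially overheard.
ROOM_POLICY = {
    "bedroom": Protection.PIN,
    "living_room": Protection.DISCREET,
}

def required_protection(room: str, guests_present: bool) -> Protection:
    """Escalate to discreet mode whenever eavesdroppers may be present."""
    level = ROOM_POLICY.get(room, Protection.DISCREET)  # unknown rooms: be cautious
    if guests_present:
        level = max(level, Protection.DISCREET)
    return Protection(level)

if __name__ == "__main__":
    print(required_protection("bedroom", guests_present=False))     # Protection.PIN
    print(required_protection("living_room", guests_present=True))  # Protection.DISCREET
```

Even this toy version hints at the privacy challenge Ponticello mentions: to pick the right level, the device has to know where it is and who is in the room.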

Most participants were aware that the most convenient solution is often not the most secure, and they accepted this trade-off. For them, using voice assistants is part of a lifestyle. "It's not just about pure functionality. Rather, the devices are associated with a certain lightness and casualness." Many users perceive digital assistants as a kind of butler and attribute almost human traits to them. Security mechanisms for such platforms must not destroy this feeling. "That's a tough constraint from a security perspective," Ponticello explains. For this reason, security mechanisms cannot simply be transferred from other platforms to these systems. Only if users understand and accept them can the mechanisms serve their purpose. "Otherwise, usage errors are inevitable."

Trust in applications and services usually builds up over time, and since voice assistants have not been on the market for very long, many users are still skeptical. "However, there are many important and useful fields of application for these services. Voice assistants can make life much easier for blind and visually impaired people, for example, as well as for the elderly," says Ponticello. "So it's definitely worthwhile to continue the research here and make the devices secure for such applications."