Unit 2, Eliburn Office Park, Livingston, EH54 6GR
+441506668613

Two More Security Holes In Voice Assistants

Providing IT support and solution to small and medium businesses. Servicing Edinburgh, Livingston, Fife and surrounding areas. Responsive, Flexible, Professional and friendly local support.

Two More Security Holes In Voice Assistants

Researchers from Indiana University, the Chinese Academy of Science, and the University of Virginia have discovered 2 new security vulnerabilities in voice-powered assistants, like Amazon Alexa or Google Assistant, that could lead to the theft of personal information.

Voice Squatting

The first vulnerability, outlined in a recent white paper by researchers has been dubbed ‘voice squatting’ i.e. a method which exploits the way a skill or action is invoked. This method takes advantage of the way that VPAs like smart speakers work. The services used in smart speakers operate using apps called “skills” (by Amazon) or “actions” (by Google). A skill or an action is what gives a VPA additional features, so that a user can interact with a smart assistant via a virtual user interface (VUI), and can run that skill or action using just their voice.

The ‘voice squatting’ method essentially involves tricking VPAs by using simple homophones – words that sound the same but have different meanings. Using an example from the white paper, if a user gives the command “Alexa, open Capital One” to run the Capital One skill / action a cyber criminal could create a malicious app with a similarly pronounced name e.g. “Capital Won”. This could mean that a voice command for Capital One skill is then hijacked to run the malicious Capital Won skill instead.

Voice Masquerading

The second vulnerability identified by the research has been dubbed ‘voice masquerading’. This method of exploiting how VPAs operate involves using a malicious skill / action to impersonate a legitimate skill / action, with the intended result of tricking a user into reading out personal information / account credentials, or to listen-in on private conversations.

For example, the researchers were able to register 5 new fake skills with Amazon, which passed Amazon’s vetting process, used similar invocation names, and were found to have been invoked by a high proportion of users.

Private Conversation Sent To Phone Contact

These latest revelations come hot on the heels of recent reports of how a recording the private conversation of a woman in Portland (US) was sent to one of her phone contacts without her authorisation after her Amazon Echo misinterpreted what she was saying.

What Does This Mean For Your Business?

VPAs are popular but are still relatively new, and one positive aspect of this story is that at least these vulnerabilities have been identified now by researchers so that changes can (hopefully) be made to counter the threats. Amazon has said that it conducts security reviews as part of its skill certification process, and it is hoped that the researchers’ abilities to pass-off fake skills successfully may make Amazon, Alexa and others look more carefully at their vetting processes.

VPA’s are now destined for use in the workplace e.g. business-focused versions of popular models and bespoke versions. In this young market, there are however, genuine fears about the security of IoT devices, and businesses may be particularly nervous about VPAs being used by malicious players to listen-in on sensitive business information which could be used against them e.g. for fraud or extortion. The big producers of VPAs will need to reassure businesses that they have installed enough security features and safeguards in order for businesses to fully trust their use in sensitive areas of the workplace.

Book a free consultation call
Close