13 Jun
Lax security and privacy in voice assistants leave you dangerously exposed!
There’s a cute side and a dark side to the recent story of a six-year-old managing to order a $170 dollhouse via an Amazon Echo device, without Mom and Dad knowing. Unfortunately, the cute side wears thin rapidly, while the dark side could reach depths you never dreamed of. The root problem, in this case, was the acute sensitivity of Alexa, the voice assistant associated with the Echo, to human voice commands, coupled with Alexa’s inability to differentiate between voices of different people.
For example, if a TV reporter covering the dollhouse incident quotes the six-year-old by saying something like, “Alexa, order me a dollhouse”, and your Echo is within earshot of your TV, it will likely go into ordering mode on your Amazon account. That is exactly what happened to several unsuspecting viewers and Echo owners when the story aired.
On the other hand, once the order has been given, the information flows that coordinate payment and delivery of that dollhouse you never wanted may be relatively secure. The insecurity lies at the beginning, when anybody (family, friend, foe, or TV anchor) can issue commands. Your privacy can then end up in shreds if nosy parkers start querying Alexa for details of your private life, or if law enforcement agencies subpoena your own conversations with Alexa.
Businesses may not escape this problem either. Amazon’s Echo is already being used by at least one forward-looking business intelligence software vendor to provide voice-activated access to chatbots and natural language processing, so that you can get facts, figures, and trends simply by saying what you want. Granted, you may not have hooked Echo up to your corporate bank accounts, but still, do you really want it to squawk out details of your most profitable customers to anybody who happens to be passing by?
Thinking you’ve fixed everything by locking down security inside a voice assistant could leave you highly vulnerable in terms of privacy, and vice versa. To stay safe in business, at home, or anywhere else, both aspects must be addressed:
· Privacy. Make sure you know how to turn the voice assistant off or stop it from listening. Use a security code to help ensure your voice assistant reacts only to your commands – Amazon, for example, lets you set a 4-digit confirmation code for voice purchases. Use delete functions (after checking they really do delete) to wipe out conversations or indiscretions picked up and stored by the assistant.
· Security. Check that any voice assistant app does not ask for unreasonable permissions at installation or startup time. Make sure the app itself does not contain vulnerabilities, suspect code, or links to sites and resources with bad reputations. You can use free, high-quality access to a service like Mi3 Security’s Threat Center to check any app you or your business wants to use, or is in the process of developing.
Never assume that voice assistants or similar devices and apps are safe. While this kind of Internet of Things endpoint is increasingly popular, default privacy and security levels may be sadly inadequate. So, take steps to make sure you aren’t the bemused recipient of a dollhouse tomorrow, or the stunned victim of industrial espionage, all because of a voice assistant app that didn’t know right from wrong.
This article originally appeared on the Mi3 Security blog.