Catch Of The Week: Alexa & Google Voice In The House

By BECKY RUTHERFORD
Los Alamos

“My son asked why I speak so softly in the house … I said I was afraid Mark Zuckerberg was listening. He laughed. I laughed. Alexa laughed. Siri laughed.”

So runs a popular internet meme about privacy, or the lack thereof, in today’s connected world. Funny, right?

Amazon Echo and other voice assistant devices (like Google Home) are great and can make life easier but at the cost of privacy. As of January 2019, Amazon had sold more than 100 million Alexa enabled devices. One in five adults owns a voice assistant, with Alexa having 70 percent market share and Google  24 percent. How much trust can we put in these digital assistants?

Stories of Alexa gone rogue are easy to find online. In San Francisco, a user claimed Alexa cheerily stated: “Every time I close my eyes, all I see are people dying.” An Amazon customer in Germany was mistakenly sent 1,700 audio files from someone else’s Echo in 2018. The audio files contained enough information to name and locate the user and his girlfriend. Amazon stated this was due to human error.

In another case, Alexa recorded private conversations and messaged them to a user’s employee.

But Alexa only listens when you say the wake word, “Alexa”, right? For now, yes, but Amazon did apply for a patent for an “always-on” feature for Alexa this year. Not to mention, the devices aren’t the only ones listening to you. Amazon and Google both employ teams of contractors that transcribe a small percentage of all voice commands captured after the wake word is detected and feeds them back into the software to help Alexa learn to understand human speech and become more efficient.

You can opt-out of Amazon using your voice recordings in the privacy section of the Alexa app. Employees listening in do not have direct access to customer’s personal information but can access the device’s serial number and Amazon account associated with the device. If this makes you nervous, both devices feature a “mic off” button that can switch off the microphone when not in use.

Recently security researchers were able to develop malicious apps that can be used to hack both Google and Amazon devices. Amazon and Google don’t verify and vet Alexa and Google Home apps when they are submitted (they do verify the apps at subsequent updates). This policy enables bad actors to submit malicious apps. SRL was able to upload these malicious apps to the Amazon and Google app stores and ran tests to confirm the attacks would work. SRL did remove the malicious apps after the testing was completed and notified Amazon and Google. Neither company has addressed this issue, but both companies responded that they are working on ways to protect customers against malicious apps.

SRL has dubbed the flaw “Smart Spies”. How the attack works: All smart speakers feature a pause between the moment it finishes recording, and when it begins speaking. This pause can be altered by inserting a command into the malicious app’s code to have the device speak an unpronounceable string of characters (Unicode characters). These unpronounceable characters cause the device to remain silent while the app is still running, giving the impression it has terminated when it is still listening. This would allow the app to continue recording the user without their knowledge.

The flaw could also potentially be used to carry out a phishing attack against the user. The app continues to run after you think it has stopped. It could wait for an hour or so, then speak a fake error message requesting credentials and steal this information to access your Amazon/Google account. Amazon and Google have stated that voice assistants will never request your credentials.

The best way to protect yourself from malicious apps is to be cautious. Only download Google Home and Amazon Alexa apps from trusted developers. How can you tell if an app developer is trustworthy?

Check out the reviews, are there just a few, or are there thousands? If there aren’t many reviews, the app is likely new and may not be trustworthy.

Look at all the results before selecting; make sure you are selecting the legitimate app and not a lookalike.

Read the app’s description; are there misspellings and bad grammar? This can be a red flag that the app is not to be trusted.

Always watch out for permissions- follow the rule of least privilege. Only give an app the permissions it needs to run. E.g. a horoscope app should not need access to all your contacts, payment information, social media logins, location, etc.

Remember, these voice assistants may be convenient, but the loss of privacy is no laughing matter. I’ve got to go; my Alexa and Google Home are having another argument…

CSTsiteisloaded