Is Amazon’s Alexa, Google Home Recording Your Personal Conversations at Home or Office?
New Delhi, February 19: At a time when digital assistants in smart devices at home or office are talking to us like never before, some users have begun to worry: Is Alexa or Google Home listening to and recording personal conversations beyond the "wake" word? There are multiple triggers for such concerns, the latest being an Amazon voice assistant user in Germany who received 1,700 audio files belonging to a stranger he had never met. A woman in the US state of Oregon was in shock last year when the Amazon Echo device at her Portland home recorded a private conversation and then shared it with one of her husband's employees in Seattle.
Amazon later clarified that Alexa mistakenly heard a series of commands and sent the recording as a voice message to one of the husband's employees. The threat is very much real, with more and more Indians hooked on always-on, Internet-connected smart home devices. In a recent Forrester report titled "Secure The Rise Of Intelligent Agents", Amy DeMartine and Jennifer Wise argue that the current, introductory versions of intelligent agents include Alexa, Cortana, Google Assistant and Siri. However, security is not part of the equation, and unless security pros get involved, the implications are more worrisome for businesses than for ordinary individuals.
"Alexa doesn't currently authenticate or authorise individuals who access it, leaving a company's Alexa skills unprotected from anyone who can remember another user's commands," reads the report. "A hacker has already developed a method to install malware on a pre-2017 Amazon Echo that streams the microphone to any remote computer, accesses the owner's Amazon account, and installs ransomware," the Forrester report added.
Apple logs and stores Siri queries but they are not associated with an Apple ID or email address, and the company deletes the association between queries and their numerical codes after six months. Amazon and Google devices, however, save query histories until the customer deletes them, and Microsoft Cortana users must manage their own data retention preferences in the Cloud and on their devices.
According to Puneesh Kumar, Country Manager for Alexa Experiences and Devices, Amazon India, the threat of Alexa recording all your conversations is not real, as the company has built layers of privacy protections into all of its Echo devices. "It includes a mute button involving a hardware press that electrically disconnects the microphones and cameras, clear visual indicators when utterances are being captured and streamed, as well as the ability to see and delete voice recording history for their devices," Kumar told IANS.
Echo speakers use on-device keyword spotting to detect the "wake" word and only the "wake" word. When the "wake" word is detected, the light ring around the top of the device turns blue to indicate that Alexa is streaming audio to the Cloud. "At any time, you can turn the microphone off by pushing the microphone button on the top of the device and this creates an electrical disconnect to the mic, which will turn on a red ring to visually indicate that the device is muted," informed Kumar.
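The gating behaviour Kumar describes can be illustrated with a short sketch. This is a hypothetical Python simulation, not Amazon's actual implementation: audio frames stay on the device until local keyword spotting detects the wake word, and a hardware mute cuts the microphone entirely.

```python
def process_audio(frames, wake_word="alexa", muted=False):
    """Return the audio frames that would be streamed to the cloud.

    Frames heard before the wake word stay on-device, and nothing is
    captured at all while the hardware mute is engaged.
    """
    if muted:                       # hardware mute: mic is electrically disconnected
        return []
    streamed = []
    listening = False
    for frame in frames:
        if not listening:
            if frame == wake_word:  # on-device keyword spotting
                listening = True    # the light ring would turn blue here
        else:
            streamed.append(frame)  # only now does audio leave the device
    return streamed


# Only speech after the wake word reaches the cloud:
print(process_audio(["hello", "alexa", "what", "time"]))  # ['what', 'time']
# With the mute button pressed, nothing is streamed:
print(process_audio(["alexa", "what", "time"], muted=True))  # []
```

The sketch captures the key privacy claim: the decision to stream is made locally, before any audio is transmitted.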
According to Amazon, voice utterances spoken to the device may be used to deliver and improve its services. Users can, if needed, delete specific voice recordings associated with their accounts by going to History in Settings in the Alexa App, drilling down to a specific entry, and then tapping the delete button. You can also delete all voice recordings associated with your account for each of your Alexa-enabled products.
(The above story first appeared on LatestLY on Feb 19, 2019 03:21 PM IST.)