VOICE ASSISTANTS - SMART SPEAKERS
TOPICS BELOW
Amazon Alexa/Echo, Apple, Google Assistant, Microsoft
NOTE: This page was ignored for a few years as I am not a big user of any of these systems. Amazon changed the rules for Alexa in March 2025, which prompted the latest updates and the move of this content to its own page.
All of the smart assistants (from Amazon, Google and Apple) sometimes record at the wrong time. That is, they record without a person having said the wake word. And, since all three companies send some recordings to contractors, to help improve the system, strangers may hear your embarrassing conversations. Tony Soprano would not have allowed Siri in his home.
The rules about this have changed over time. Amazon used to let customers manually delete past recordings and disable human review of Alexa recordings. This changed drastically in March 2025. Google used to let you access your history, delete past recordings and automatically delete your data every couple of months. Initially, Apple did not have any way to opt out. In early Aug 2019 they took their first step and did more in iOS 13.2.
October 2019: Alexa and Google Home abused to eavesdrop and phish passwords by Dan Goodin for Ars Technica. Everyone's worst fear came true. Malicious apps were developed that listened all the time. Wake word? We don't need no [expletive] wake word. Germany's Security Research Labs developed the apps and they passed the Amazon and Google security-vetting process. Some of the apps logged all conversations within earshot of the device and sent a copy to the app developer. Others mimicked the voice used by Alexa and Google Home to falsely claim a device update was available and prompted the victim user for a password to enable the update. Yikes. More: Malicious Apps on Alexa or Google Home Can Spy or Steal Passwords by Ionut Ilascu for Bleeping Computer (Oct. 2019).
AMAZON ALEXA
March 2025: The rules about talking to Alexa are changing. Previously, Alexa devices could process your spoken instructions on the devices themselves. No more. Now, everything you say after the wake word is sent to Amazon. If you don't want Amazon to have access to recordings of everything you say to Alexa, your only choice is not to use the Echo device at all. If you continue to use Alexa, you have to decide whether to let Amazon save voice recordings or not. By default, Amazon will delete recordings of your Alexa requests after processing. However, this means that Echo users can no longer use the Voice ID feature, which lets an Echo device distinguish the different voices talking to it. Voice ID is set to become more advanced and central to the next generation of Alexa. On the other hand, if Amazon saves your voice recordings, then employees and/or contractors will listen to them.
Everything you say to your Echo will be sent to Amazon starting on March 28 by Scharon Harding for Ars Technica (March 14, 2025).
To stop an Echo from saving voice recordings, using the Alexa app:
Settings -> Alexa Privacy -> Manage Your Alexa Data -> Don't save recordings
April 2025: If you want to continue to use an Alexa device after the above changes, consider unplugging it when not in use, when people are visiting or when having certain conversations.
Documentation From Amazon
Should we trust Amazon?
April 2019: Bloomberg reported that Amazon Workers Are Listening to What You Tell Alexa. There are options in the app to disable this (Settings -> Alexa Account -> Alexa Privacy -> Manage How Your Data Improves Alexa) but they may not be honored.
April 2019: Another privacy issue with Alexa is that the devices phone home to Amazon and to others, even when they are not being used. No one knows why. From: This Simple Tool Will Reveal the Secret Life of Your Smart Home by Kashmir Hill for Gizmodo.
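For the technically inclined, one way to see this phoning home for yourself is to watch the DNS lookups a smart speaker makes; before a device contacts its vendor, it usually has to look up the hostname it is calling. Below is a minimal Python sketch along those lines. To be clear, this is my illustration, not the tool from the Gizmodo article. It assumes the scapy packet-capture library is installed (pip install scapy), that the script runs with root privileges on a machine positioned to see the speaker's traffic (such as the router, or a PC on a mirrored switch port), and SPEAKER_IP is a placeholder for the LAN address of your device.

from scapy.all import sniff, DNSQR, DNS, IP

SPEAKER_IP = "192.168.1.50"  # placeholder: the LAN address of your smart speaker

def show_dns_query(pkt):
    # Report DNS queries (qr == 0 means query, 1 means response)
    # that originate from the speaker's IP address.
    if pkt.haslayer(IP) and pkt.haslayer(DNSQR) and pkt[IP].src == SPEAKER_IP:
        if pkt[DNS].qr == 0:
            name = pkt[DNSQR].qname.decode(errors="replace")
            print(f"{SPEAKER_IP} looked up {name}")

# Watch DNS traffic (UDP port 53) until stopped with Ctrl-C.
sniff(filter="udp port 53", prn=show_dns_query, store=False)

If the reporting above is right, you should see lookups appear even when nobody is using the device.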
May 2019: Alexa has been eavesdropping on you this whole time by Geoffrey Fowler for the Washington Post. Amazon keeps a copy of everything Alexa records after it hears the wake word. Fowler listened to 4 years of his recordings and found that dozens of times it recorded when it should not. It even picked up some sensitive conversations. There are instructions for deleting these recordings via the Alexa app. Hear your archive at www.amazon.com/alexaprivacy.
May 2019: Also from Fowler: Amazon collects data about third-party devices even when you do not use Alexa to operate them. For example, Sonos keeps track of what albums, playlists or stations you listen to and shares that information with Amazon. You can tell Amazon to delete everything it has learned about your home, but you cannot look at this data or stop Amazon from continuing to collect it.
ALEXA SKILLS
Alexa initial configuration: the app wants to "periodically upload your contacts" - say Later (there is no NO). The app also wants to verify your phone number when first configured; there is no need for this, so skip it.
Alexa Defenses in the Settings of the Alexa app: (last reviewed in 2020)
APPLE (Siri, Apple Watch and HomePod smart speakers)
July 2019: Apple contractors 'regularly hear confidential details' on Siri recordings by Alex Hern in The Guardian. Accidental activations pick up extremely sensitive personal information, fairly often. The story came from a whistleblower; not a good look for Apple.
If an Apple Watch detects it has been raised and then hears speech, Siri is activated. To prevent this, turn off Raise to Speak on the watch (Settings -> Siri -> Raise to Speak). On the iPhone, you can also disable the Siri side button:
Settings -> Siri & Search -> toggle off "Press Side Button for Siri".
August 2019: Apple Suspends Listening to Siri Queries Amid Privacy Outcry by Mark Gurman of Bloomberg.
Defense as of Aug 2019: If both Siri and dictation are disabled, Apple will delete your data and recent voice recordings. To disable Siri:
Settings -> Siri & Search -> turn off both the Listen and Press Button options
To disable dictation:
Settings -> General -> Keyboard -> turn off Enable Dictation
This process will change.
October 2019: Defense added in iOS 13.2. When upgrading to iOS 13.2, released in Oct. 2019, users see a pop-up message offering the ability to opt out of having their voice commands stored and saved. It is called "allowing Apple to store and review audio of your Siri and Dictation interactions". Later, this can be adjusted in the Privacy settings under "Analytics & Improvements", where there are multiple options about sharing analytics, as well as the option to "Delete Siri & Dictation History" and an option to stop sharing voice recordings with Apple. Also, in Settings -> Siri, you can tell Apple to delete all the Siri voice recordings that it has stored.
GOOGLE ASSISTANT
Again from the Fowler article: Google used to record conversations with its Assistant ("Hey Google") but in 2018, they stopped doing so by default on new setups. You can check the settings of your Assistant at myaccount.google.com/activitycontrols/audio. Look for the option to pause recordings. This How-To Geek article adds instructions for deleting the previously saved recordings.
The Nest thermostat, made by Google, phones home every 15 minutes, reporting the climate in the home and whether there is anyone moving around. The data is saved forever. (also from the Fowler article)
Google Defense: in the Google Home app: Account -> More settings (under Google Assistant) -> Your data in the Assistant -> turn off Voice & Audio Activity. While there, also go to Manage Activity to review and/or delete voice recordings.
To delete Google Assistant voice recordings, start at myaccount.google.com/intro/activitycontrols. Scroll to "Voice & Audio Activity", where Paused means disabled. Or, you can use these voice commands: "Hey Google, delete what I just said" or "Delete what I said on [date]" or "Delete my last conversation". The voice commands only work for the last 7 days.
You can use the Voice Match function to ensure your personal results are only available to you. See how.
MICROSOFT: SKYPE, CORTANA and XBOX
In Aug. 2019, Joseph Cox of Motherboard revealed that "Contractors working for Microsoft are listening to personal conversations of Skype users conducted through the app’s translation service ... [and] ... Microsoft contractors are also listening to voice commands that users speak to Cortana, the company's voice assistant." Shortly thereafter, Cox revealed that Microsoft Contractors Listened to Xbox Owners in Their Homes. As with all the other companies, recordings were sometimes triggered by mistake. At the Microsoft Account Privacy Settings page you can delete any recordings Microsoft has of you.