Voice assistants such as Google Assistant and Alexa record what you say after the wake word and send it to corporate servers. Companies keep those recordings until you delete them. With some companies, you can disable this behavior. Here's how.
Voice assistants record what follows their wake word.
Voice assistants work in a simple way. They hear everything you say, all day long. However, the device in the room doesn't have much intelligence of its own. The only thing it can understand is its wake word: "Alexa," "Hey Google," "Hey Cortana," and so on.
Once it recognizes the wake word, it begins recording everything that follows (plus roughly a second of audio from before, starting from when it first thought it heard the wake word). The device sends that recording to the company's servers (Amazon's, Google's, and so on) to figure out what you said and act on it.
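The flow above can be sketched as a toy Python loop. This is a simplification, not any vendor's actual protocol: the wake word, one-chunk buffer, and `"<silence>"` end-of-command marker are all stand-ins chosen for illustration.

```python
from collections import deque

WAKE_WORD = "alexa"         # stand-in wake word for this sketch
PRE_BUFFER_CHUNKS = 1       # roughly one second of audio kept from before the wake word

def process_stream(chunks):
    """Simulate the local wake-word loop described above.

    Each chunk stands in for about a second of audio. The device keeps
    only a tiny rolling buffer locally; nothing is "uploaded" until the
    wake word is heard, at which point the buffered second plus the
    command that follows is collected for the server.
    """
    buffer = deque(maxlen=PRE_BUFFER_CHUNKS)  # rolling pre-wake-word buffer
    uploads = []                              # what would be sent to servers
    recording = None
    for chunk in chunks:
        if recording is not None:
            recording.append(chunk)
            if chunk == "<silence>":          # command finished: hand it off
                uploads.append(recording)
                recording = None
        elif WAKE_WORD in chunk:
            # Wake word detected: start recording, including the buffered
            # audio from just before the wake word was spoken.
            recording = list(buffer) + [chunk]
        else:
            buffer.append(chunk)              # everything else is discarded locally
    return uploads
```

Note how ordinary conversation (`"chatter"`) never leaves the device; only the buffered second plus the command after the wake word would be uploaded:

```python
stream = ["chatter", "more chatter", "alexa", "what time is it", "<silence>", "private talk"]
print(process_stream(stream))
# → [['more chatter', 'alexa', 'what time is it', '<silence>']]
```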
However, after executing your command, companies don't necessarily delete the recording. Instead, your spoken words are stored indefinitely to improve the voice assistant's results and to develop new features.
With some companies, you can disable this behavior. With others, you can't. Sometimes disabling the recording cripples the voice assistant entirely, but not always. Here's a summary of what you can do and what the consequences are.
Google is the Leader in Choice
Google stands alone in giving you the ability to use Google Assistant without having your voice recordings stored forever. In a real show of leadership, this is now the default behavior for new users who set up Google Assistant.
Existing users were opted in to the old system that stores your voice recordings. You can disable it, however. Turning off voice storage is as easy as visiting Google's Activity Controls page, finding "Voice & Audio Activity," and clicking "Pause."
Best of all, disabling voice storage doesn't affect Google Assistant or Google Home devices at all. So there's no reason not to turn it off if you don't like the idea of a large company keeping copies of your voice.
With Alexa, you don't have much choice
Amazon doesn't offer a Google-style option to prevent the storage of your voice recordings. If you use Alexa on an Echo device or through the Alexa app, your voice is sent to and processed on Amazon's servers, and Amazon saves the recordings to improve Alexa.
Your only options are to listen to and delete your recordings after the fact, or to do without an Alexa device altogether. You can mute Echo devices, but that's not necessarily a permanent solution: if someone else notices the device is muted and turns it back on, you're back where you started. And muting kills your ability to use Alexa at all, rendering your hardware useless.
Amazon does provide a privacy dashboard where you can tell the company not to use your voice recordings to develop new features or to improve transcriptions. Just click the "Manage How Your Data Improves Alexa" option, then disable both settings. Note, however, that this only stops Amazon from using your data for those two purposes. It doesn't prevent your recordings from being stored or used for anything else.
Hopefully, Amazon will follow Google's example and improve the options on offer.
Cortana's only option is an off switch.
Like Amazon, Microsoft doesn't offer an option to prevent the storage of your voice recordings. You can only view and delete existing recordings in the Microsoft Privacy Dashboard.
Worse than with Amazon, you can't even restrict how Microsoft uses your recordings. Your only real option is to disable "Hey Cortana" entirely. Type "Talk to Cortana" into the Start menu's search bar, press Enter, and turn off "Hey Cortana."
If you're using a Cortana-powered speaker, you'll have to mute it instead. Of course, this means giving up Cortana almost completely. If you want to use the voice assistant, you currently have to consent to Microsoft storing your voice recordings.
Siri at least erases your recordings if you turn it off.
Apple simultaneously gives you the easiest way to erase your recordings and some of the weakest options for preventing them in the first place.
Just as with Microsoft and Amazon, the only way to prevent Apple from storing your recordings is to stop using Siri entirely. In essence, using Siri means agreeing that Apple can use your voice recordings for its purposes.
The good news is that, rather than making you hunt through a privacy dashboard, Apple deletes your recordings from its servers when you turn Siri off, as long as you turn off dictation as well.
To turn off Siri, go to Settings > Siri and turn off both Hey Siri and Siri. When prompted, tap "Turn Off." Note that your recordings will still be saved unless you also disable Dictation.
To turn off Dictation, go to Settings > General > Keyboard and disable "Enable Dictation." When prompted, tap "Turn Off Dictation." Now you'll see confirmation that your recordings will be deleted. (If you do this in the reverse order, the warnings are adjusted accordingly.)
Unfortunately, not all voice assistants are created equal. Siri makes deleting your recordings the easiest, but Google takes the crown by letting you prevent storage entirely while continuing to use Google Assistant. Hopefully, these companies will learn from each other (or even steal from each other) and give you more control over your data.