When it comes to anonymous humans listening in on your audio clips and conversations, tech companies are finally getting their privacy settings in line.
But what’s the good of having options if you don’t know how to use them?
On Monday, Google unveiled a new audio policy and settings that let people opt in to allowing human reviewers to listen to audio clips captured by Google Assistant. That came after revelations in July that some Google contractors were listening to and leaking recordings from Google Assistant.
Apple, Amazon, and, most recently, Facebook (through its Portal device) have all undergone similar cycles: a report comes out that contractors are listening to audio clips, the tech company apologizes and pauses collection, and then eventually rolls out controls (and in some cases, policy changes) that let users opt in to or out of this sort of data collection.
So, why would anyone agree to let humans analyze their voices, and why were companies using contractors to listen in the first place?
“Opting in to Voice & Audio Activity (VAA) helps the Assistant better recognize your voice over time, and also helps improve the Assistant for everyone by allowing us to use small samples of audio to understand more languages and accents,” Google’s blog post on the new settings reads.
Essentially, humans listen to audio clips in order to transcribe them (or check the accuracy of automated transcriptions), and then feed the audio and its transcription back into companies’ systems to make their voice assistants smarter.
Google and its competitors are always improving their natural language processing A.I. by feeding it more real user data paired with human-verified meaning. By allowing Google, Apple, Amazon, and Facebook to listen to your conversations and commands, you’re helping their tech get smarter. In that sense, opting in is an act of good digital citizenship (and the companies say their assistants will work better for you, personally, by better recognizing your voice).
However, it’s entirely understandable if you’d rather not. Portions of Apple’s audio recordings actually leaked in Denmark. Plus, these transcribers are often contractors, which means they’re not necessarily as thoroughly vetted as the tech companies would like you to believe.
So now that the tech giants want to justify the practice by giving their users the option to participate, it’s time you make a proactive choice and exercise your data privacy rights, isn’t it?
Here’s how to opt in to or out of voice data collection on Apple, Amazon, Facebook, and Google devices.
Google’s newly released controls are pretty easy to navigate. When you go to the “Your data in the assistant” page, scroll down to the “Voice & Audio Activity” box. By default, this should be “paused” (this sort of data collection is opt-in).
If you want to change the default setting, hover over the paused bar and click.
This will take you to another page where you can switch the toggle on if you want to allow Google to store and analyze your voice commands. You can also delete what it’s already recorded by clicking “Manage Activity.”
Transcription contractors might review clips collected by Portal that begin with “Hey Portal.”
Storage and review of “Hey Portal” audio recordings on Facebook’s Portal is opt-out, but you will be prompted to choose your settings the first time you log in if you’re a new Portal user. If you’ve already been using Portal, Facebook recently sent this push notification to users after making the adjustments:
People on existing devices will get a notification that explains how their voice data is used with a link to Settings where they can turn off storage. New Portal users will have the option to turn off storage of voice data when they set up their device for the first time.
You can also go into your settings yourself. From Facebook:
You can also go directly to Settings on Portal or the Facebook Activity Log anytime to turn off storage. If you turn off storage, none of your “Hey Portal” voice interactions will be stored or reviewed by people.
In your Portal settings, you can turn off “Storage.” In your Facebook Activity Log (located underneath your cover photo), navigate to the Voice Interactions option in the left sidebar and click the “delete all voice interactions” button in the upper right-hand corner.
Apple’s new settings aren’t out yet, but human review of transcriptions will be opt-in when they are. Apple has paused transcription review for the time being (since The Guardian reported that contractors “regularly hear confidential details” in July), and plans to roll out the new settings later this fall. All future transcription reviewers will be Apple employees, not contractors. Says Apple:
Users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
Following an April report from Bloomberg, Amazon in August gave Alexa users the option to decide whether humans would review their voice commands and messages. Users can adjust the settings in the Alexa app or on the Alexa privacy settings page on desktop. This data collection is opt-out.
Once you get to Alexa privacy from settings, navigate to “Manage How Your Data Improves Alexa.” Then you can toggle two options off (or keep them on): “Help Improve Amazon Services and Develop New Features” and “Use Messages to Improve Transcriptions.”
Now doesn’t it feel good to be in informed control?