Google, Amazon & Apple take steps to stop human review of voice recordings
First, Google was asked by the German privacy watchdog to stop human review of its voice AI recordings in Europe. This followed a leak by a contractor working as a Dutch-language reviewer, who passed more than 1,000 Google Assistant recordings to a Belgian news website. The outlet reported being able to hear confidential information such as people's home addresses and conversations covering medical conditions and emotional distress.
Google has since suspended human review in Europe, and it is unclear whether or when it will resume.
Soon after, Amazon rolled out a global option to disable human review of Alexa recordings, following Bloomberg's report that Amazon had a team of thousands of workers assigned to listen to and transcribe audio recordings, many of which included sensitive and private information such as location details and personal data.
Apple also followed suit, halting voice recording reviews by its contractors and working to offer an opt-out feature globally, after The Guardian reported that contractors were routinely hearing private information.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
— The Guardian's source
The core issue for Google, Amazon, and Apple alike is the human review of these recordings without people's consent or knowledge. While the companies' privacy policies do mention that recordings sent to them may be analyzed, they do not mention human involvement.
However, to play devil's advocate, voice assistant features are far from perfect. Anyone who has spoken to Siri or Alexa knows how carefully you must choose and enunciate your words, and even then the assistants often get it wrong. Moreover, they are frequently triggered by mistake, which is termed a "false positive."
Addressing these false positives, and improving AI voice assistants overall, requires human involvement, as the systems cannot by themselves differentiate between genuine requests, accent variations, and false positives. David Monsees, Google's product manager for Search, made this point in a blog post, writing that manual reviews of Google Assistant queries are “a critical part of the process of building speech technology.”
“These reviews help make voice recognition systems more inclusive of different accents and dialects across languages. We don’t associate audio clips with user accounts during the review process, and only perform reviews for around 0.2% of all clips.”
— Google's spokesperson
The issue then becomes one of privacy without consent, which is essentially what Johannes Caspar, the Hamburg commissioner for data protection, said in the following statement:
“The use of language assistance systems in the EU must comply with the data protection requirements of the GDPR. In the case of the Google Assistant, there are currently significant doubts. The use of language assistance systems must be done in a transparent way, so that an informed consent of the users is possible. In particular, this involves providing sufficient information and transparently informing those concerned about the processing of voice commands, but also about the frequency and risks of mal-activation. Finally, due regard must be given to the need to protect third parties affected by the recordings. First of all, further questions about the functioning of the speech analysis system have to be clarified. The data protection authorities will then have to decide on definitive measures that are necessary for a privacy-compliant operation.”
It is interesting to note that while Amazon and Apple are launching their opt-out systems globally, or at least working toward it, Google has only suspended human review of audio recordings in Europe while the investigation proceeds.