Maybe you weren’t aware that real people listen to the things you say to your Google Home and Google Assistant—or maybe you just don’t care. Either way, one of the humans hired to review and transcribe recordings to train the technology leaked more than a thousand Assistant recordings to the Belgian news organization VRT NWS, which then published a story and a video about them, according to Gizmodo.
Google, which was unaware of the rogue employee until the story was published, is understandably unhappy. The leak didn’t reveal much new or surprising information, but it reminds consumers of their least favorite aspect of home AI—the inevitability of surveillance—and it invades privacy not just by listening to conversations, but by distributing them. The Google subcontractor who leaked the recordings also let journalists examine the software used in the review and transcription process.
Google responded with a blog post explaining its review process and condemning the contractor:
“We just learned that one of these language reviewers has violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”
Google went on to explain that humans review only “0.2 percent of all audio snippets.” With Google Assistant available on more than 1 billion devices, however, 0.2 percent is hardly insignificant.
The most concerning part of the VRT NWS report is its discovery that many of the Google Assistant recordings happened by accident. Recording is only supposed to occur when the user says “Hey Google,” indicating that the user wants to use the technology’s services. The leak, however, included 153 recordings that should never have been captured in the first place. That means Google products are listening to you more than the company says they are.