
Google analysing our voices is perfectly acceptable; listening in on our most intimate moments without our knowledge is not

People at Google are listening to what we say to our phones and smart speakers. Google analyses voice recordings to improve the artificial intelligence of its devices and thus improve its services. But nowhere are we told that people are also listening. Google occupies such a prominent place in our lives that withholding that information is ethically unacceptable, Tim Verheyden believes.

Tim Verheyden
Social media expert

People can be delightful, especially when they think that no one is listening to them, such as the driver who asked Google: ‘Send a text message: Sweetheart, ich liebe dich!’, and said the latter with the requisite love and affection. Beautiful. Touching. But someone is listening. That excerpt was among the recordings I listened to, courtesy of an employee of a company that listens to that sort of message for Google. I call him Peter. I listened to more than a thousand recordings, and among them I discovered many wrongfully recorded private conversations.

Lots of people talk to their phones. To Apple's Siri, for example. But also to Google Assistant. You can ask it anything: to give you the latest news, order groceries, send messages and so on, ad infinitum. A Belgian version of Google Home, Google’s smart speaker, has recently been introduced. You can put it in your living room and have it do all sorts of additional tasks for you: turn the heating on and off, dim the lights, book you a holiday…

You give Google voice commands via Google Assistant – an app that is primarily available on mobile and smart-home devices – or via Google Home. Google stores those commands. That option is active by default. In its privacy policy, Google says that it uses those recordings to make its artificial intelligence even smarter. After all, isn’t that what we all want: a quick response to our search, a fast answer to our question, or instant directions from Waze (a GPS navigation app also owned by Google)? And to have all this, we relinquish part of our privacy.

Peter, one of the people listening to the recordings

What Google doesn’t tell us in its terms and conditions is that people like Peter are listening to excerpts from those recordings. Peter listens to Dutch-language recordings, but he has hundreds of other-language colleagues all over the world. Google’s computers literally write down every word we say. Peter and co. then get to hear and read selected excerpts. Peter has to check whether the written text is accurate, revise it if it isn’t, and flag anything that is unintelligible or has been recorded incorrectly.

Peter doesn’t know who is behind the voice – the excerpts are anonymous – but it’s sometimes not too hard to find that out. While listening to some recordings, a colleague of mine recognises an old friend from the youth movement. It is striking that the excerpts Peter gets to hear haven’t been anonymised to the point of being unrecognisable: the sentences are clearly pronounced, and people’s names and familiar companies are mentioned openly.

Questions

This recognisability raises questions. Did the smartphone of the persons concerned respond incorrectly to a clearly entered command? Did a clear question produce an unclear answer or an incorrect search? Does Peter have to transcribe dialect into clear Dutch? Or is this just an additional check? Moreover, Peter and many of his colleagues are undoubtedly honest and upright, but what if a less noble colleague passes on recognisable user information – such as account numbers or medical information – to interested third parties outside Google?

Google has undoubtedly given such pertinent questions some thought. But there isn’t any mention of them in Google’s privacy policy. Google also doesn’t tell us that people are listening to our conversations and to what we ask our smartphone or smart speaker. It only says that 'audio may be analysed'.

Among the more than 1,000 excerpts that I heard, there were 153 conversations that should never have been recorded and during which the command "Okay, Google" was clearly not given.

Peter

Despite his confidentiality agreement, Peter was willing to tell me about his work, because he thinks it’s important that people should know what happens when we give Google a voice command, and also because Google Home – and especially the Belgian version – is still very prone to mistakes.

"Among the more than a thousand excerpts that I heard, there were 153 conversations that should never have been recorded and during which the command ‘Okay, Google’ was clearly not given. They were everyday living-room conversations. For example, a woman asked her husband: ‘What’s keeping Frans? He was supposed to be here by ten and it’s already eleven o'clock.’ A mother shouted at her daughter from the bottom of the stairs: ‘Noortje, just you come down here and say that!’ Or a conversation between two women who agreed: ‘Boys aren’t as quick as girls.’ There were dozens of such examples."

It's pretty striking that I had to sign a whole list of secrecy conditions

Peter

"It’s pretty striking that I had to sign a whole laundry list of – yes – secrecy conditions. Absolutely nothing must get out about all those things. That is often the case when you work for companies like that. And yet they themselves are quite surreptitiously collecting lots of personal conversations and personal sound clips and letting other people listen to them," Peter tells me.

Most of the excerpts I hear are about fairly innocent, everyday things. But what if Google records a sensitive and intimate private conversation? Such as the mother I heard asking her son how his wound was healing, or a quarrelling couple. Do you want outsiders to listen in to such things?

And what if you have a delicate conversation with your doctor, who happens to use Google services? Even worse, what if the Google transcriber is your neighbour or your second cousin, who recognises your voice? These are, of course, fictional examples. But they are real possibilities.

I don't believe Google has bad intentions - but the system makes lots of mistakes

Peter

It is not clear what caused Google’s smart speaker to record the conversations in those first examples, but it shouldn’t happen. I live in a fairly open flat. If the TV is on in the living room, you can hear it perfectly in the bedroom. I’d love to install the Google Home I’ve bought in my flat because I love that sort of gadgetry. But there is no way I’m letting it through the door. I don't want to run the risk of someone at Google listening to an excerpt of what I’m doing in my bedroom, or to any conversations I have at home.

"I don’t believe Google has any bad intentions," says Peter. "But the system – and once again, especially Google Home – makes lots of mistakes. That system is highly sensitive: as soon as someone in the vicinity utters a word that sounds a bit like Google, it starts to record. I estimate that some 30 percent of the conversation excerpts are recorded unintentionally. Very often they’re about trivial things. But sometimes they’re about highly personal matters, such as relationship problems. Sometimes, they are blazing rows."

Professional listeners also often pick up snatches of medical issues, because people also use Google for medical and health-related searches. And the cast-iron law of the Internet is also in force here: men seem to search for pornography remarkably often, even via Google Home.

Under no circumstances can medical data be listened to in this way

Researcher Jef Ausloos

"The new privacy laws are very clear," says researcher Jef Ausloos. "Under no circumstances may medical data be listened to in this way. But the other examples given are also unacceptable. Google should work the way scientists do when they are conducting medical tests: they should test their systems in a controlled environment, using volunteers. Or, at the very least, they should do tests to improve their artificial intelligence with people who are fully aware of their participation and its purpose. Google is not transparent about what happens to the audio recordings; we don't know what is being recorded, how many seconds are being listened to, or how long those recordings are saved."

The law says it is illegal to eavesdrop on conversations, but there is no ban on recording a conversation in which you are participating. If Google records conversations between people in a living room, however, then that is definitely an issue. And the psychological aspect is even more important than the legalities. Do we want others listening to what we say into our phones or smart speakers? We have therefore decided to broadcast some of the excerpts we heard on radio and television. But we have, of course, distorted them a bit and made them unrecognisable.

This is not a plea for a life without technology, but there should be far more clarity about how we are surreptitiously being watched and listened to every day

Tim Verheyden

This is not a plea for a life without technology. Technology makes so many aspects of our lives more agreeable. Technology is an essential part of our society: in our smartphones, in our cars, in our workplaces... In hospitals, technology saves lives. It would be pointless to even try to sum up everything that technology does for us. Imagine a world without that technology. It doesn’t bear thinking about.

Speech-processing technology will play an increasingly prominent role in this, so analyses of our voices have to be carried out in every possible language spoken in the world today.

But a tech giant such as Google now occupies such a prominent place in our lives that there has to be far more clarity about how we are surreptitiously being watched and listened to every day. And if we are bombarded with smart speakers, then we mustn’t only have eyes and ears for the convenience and coolness of these gadgets; we have to stop and literally ask ourselves what we are bringing into our homes. New laws have recently been passed to govern this issue, so: "Okay, Google, how do you really deal with privacy?"
