If you use Apple's AI-powered voice assistant Siri — or own a Siri-enabled device — there's a chance a human on the other side of the world may be listening to you have sex right now.

Or they may be hearing that conversation you had with your boss about a new marketing strategy. Or that awkward exchange with your doctor about a really private medical problem.

That's the takeaway from a troubling new Guardian story in which an Apple whistleblower details how the company lets contractors review audio of users' Siri commands — as well as recordings never actually meant for Siri's digital ears — to improve the digital assistant.

To hear Apple tell it, though, the whole review process seems rather innocuous.

The company told The Guardian it regularly sends Siri activations to "secure facilities" where human reviewers listen to the clips, which are usually just a few seconds long and stripped of the Apple user's ID and name.

These contractors grade the audio snippets, noting whether Siri handled the request appropriately and flagging any mistakes, such as Siri thinking it had heard its “wake word” when it didn’t.

Fewer than 1 percent of all Siri activations are subjected to this process, Apple said, and the goal is to improve Siri's ability to understand and assist users.

But as The Guardian discovered, there are several key problems with Apple's activation vetting process — and how the company describes it to users.

For one, Apple doesn't explicitly state in its consumer-facing privacy documentation that humans might be listening in when users talk to Siri.

It also doesn't put much effort into hiring trustworthy contractors or making sure the audio clips can't be traced to their sources, the whistleblower told The Guardian.

"There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad," they said, later adding that "it’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings]."

Perhaps the most troubling revelation in The Guardian's story, though, is the regularity with which these human reviewers hear audio that wasn't even meant for Siri.

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," the whistleblower said. "These recordings are accompanied by user data showing location, contact details, and app data."

The recordings can also be far longer than the few seconds Apple described to The Guardian, with the whistleblower noting that some can last upwards of 30 seconds.

It's not entirely surprising that Apple lets humans review audio recorded by its AI assistant, given that we already knew Amazon and Google do the same thing.

And digital assistants appear to be not only here to stay, but likely to become even more ubiquitous, meaning these companies probably aren't going to stop trying to perfect the tech any time soon.

So, if Apple, Google, Amazon, and the rest of the tech giants are determined to let humans review audio recorded by their AI assistants, maybe they should all focus on perfecting just one aspect of the tech first: training the assistants to listen only when spoken to.

READ MORE: Apple contractors 'regularly hear confidential details' on Siri recordings [The Guardian]

More on AI assistants: Amazon Workers Listen to Your Alexa Conversations, Then Mock Them

