Monday, July 29, 2019

Apple Contractors “Regularly Hear Confidential Details” on Siri Recordings

Hugo Gutiérrez (Google translation, via Adrian Tineo):

The listening to private recordings is handled by a company subcontracted by Apple, just as Google does, as EL PAÍS has already reported. These reviewers are responsible for analyzing private conversations and requests made to the virtual assistant on Apple devices.

[…]

In the case of Apple’s transcribers, working conditions were much better than those of the employees who performed this work for Google, although the work is almost identical. The reviewers consulted confirm that they were not paid per audio clip processed but received a monthly salary. “You could choose the number of contracted hours. In my case I was part-time, 30 hours a week, and earned 1,100 euros gross per month.” They did, however, have a target to meet: about 150 audio files heard per hour. That works out to about 4,500 recordings reviewed a week.

[…]

Where there was strict control was in the number of recordings processed; failing to meet the quota was grounds for dismissal. “They kept changing it several times during the months I was working for this company. In fact, in my last weeks there, the target they set was practically impossible to meet, and they knew it,” says a former employee.

It was previously reported that Apple had humans reviewing Siri audio data, but it was not known that they were contractors.

Alex Hern (MacRumors):

Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

[…]

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

[…]

“There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

Devin Coldewey:

Apple’s privacy policy states regarding non-personal information (under which Siri queries would fall):

We may collect and store details of how you use our services, including search queries. This information may be used to improve the relevancy of results provided by our services. Except in limited instances to ensure quality of our services over the Internet, such information will not be associated with your IP address.

It’s conceivable that the phrase “search queries” is inclusive of recordings of search queries. And it does say that it shares some data with third parties. But nowhere is it stated simply that questions you ask your phone may be recorded and shared with a stranger. Nor is there any way for users to opt out of this practice.

Jason Snell (tweet):

It doesn’t matter to me if this is Amazon or Apple. I don’t want human beings listening to the audio these devices record. In fact, I don’t want recordings made of my audio, period—I want the audio processed and immediately discarded.

Apple boasts constantly about taking user privacy seriously. There’s one right response to this report, and it’s to change its policies and communicate them clearly. A mealy-mouthed response about how the eavesdropping is done in a secure facility without an Apple ID attached is not good enough.

David Heinemeier Hansson:

Steve Jobs: “Privacy means people know what they’re signing up for, in plain English, and repeatedly... Let them know precisely what you’re going to do with their data.”

How many Siri users know that contractors are listening in when they trigger it, intentionally or not?

Nick Heer:

Even so, there should surely be a way to opt out entirely and not allow any of your Siri conversations to be selected for review. It’s absurd that there seemingly isn’t a way to do this — turning off Siri entirely is not a solution — though I’ve reached out to confirm if disabling the analytics sharing options in Settings would opt users out. Also, as with Google, I have to question why users are not first asked whether a human can review their audio recording.


Update (2019-08-01): Peter Cohen:

I’m unpacking this in real-time today so I apologize for the thread. But near as I can tell, Apple doesn’t give any way at all of excluding Siri recording samples from the data you share with Apple.

Michael Potuck (Hacker News):

However, almost 65% of 9to5Mac readers said they want an option to turn off the ability for Apple to record and listen to Siri activations in our recent poll.

Now, Jan Kaiser has shared an iOS profile to turn off logging of server-side Siri commands on GitHub (if you prefer to make your own profile to do this, head below).

[…]

Kaiser also notes that you can make your own profile to restrict Siri’s logging with Apple Configurator if you don’t want to download the one shared on GitHub.

Why doesn’t iOS have a built-in setting to control this? At the very least, it should honor the general setting to not send data back to Apple to help improve its products.
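If you would rather not install an unvetted profile from GitHub, a profile like this can also be generated with a short script. Here’s a rough Python sketch using plistlib; note that the payload type (com.apple.assistant.support) and the “Server Logging” key are my assumptions based on descriptions of Kaiser’s profile, not values I have verified, so treat it as illustrative and check it against the real profile (or build your own in Apple Configurator) before installing anything.

```python
import plistlib
import uuid

# Sketch of a configuration profile intended to disable server-side logging
# of Siri commands. NOTE: the payload type ("com.apple.assistant.support")
# and the "Server Logging" key are assumptions based on descriptions of
# Kaiser's profile; verify against the profile on GitHub before relying on it.
siri_payload = {
    "PayloadType": "com.apple.assistant.support",  # assumed payload type
    "PayloadIdentifier": "com.example.siri-no-logging.payload",
    "PayloadUUID": str(uuid.uuid4()).upper(),
    "PayloadVersion": 1,
    "Server Logging": False,  # assumed key name
}

profile = {
    "PayloadType": "Configuration",
    "PayloadDisplayName": "Disable Siri Server Logging (sketch)",
    "PayloadIdentifier": "com.example.siri-no-logging",
    "PayloadUUID": str(uuid.uuid4()).upper(),
    "PayloadVersion": 1,
    "PayloadContent": [siri_payload],
}

with open("siri-no-logging.mobileconfig", "wb") as f:
    plistlib.dump(profile, f)
```

The resulting .mobileconfig file can then be transferred to the device (AirDrop, email, etc.) and installed from Settings.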

Update (2019-08-02): Matthew Panzarino (tweet, MacRumors, Bloomberg):

Apple says it will review the process that it uses, called grading, to determine whether Siri is hearing queries correctly, or being invoked by mistake.

In addition, it will be issuing a software update in the future that will let Siri users choose whether they participate in the grading process or not.

[…]

When this story broke, I dipped into Apple’s terms of service myself and, though there are mentions of quality control for Siri and data being shared, I found that it did fall short of explicitly and plainly making it clear that live recordings, even short ones, are used in the process and may be transmitted and listened to.

Russell Ivanovic:

What happens on your iPhone stays on your iPhone.*

*Until it turns out it doesn’t. Then if the story doesn’t get much traction and you don’t notice we’ll still pretend it does. But when it blows up we’ll fix it. Clear?

Sam Gross:

I’m wondering whether there’s an internal story behind this. I’d bet a bunch of people said no, but some senior person, under pressure to improve Siri, said yes.

Phillip Molly Malone:

The issue isn’t that they do it! The issue is twofold, to me anyway:

  1. They are waging a holy war on privacy and making themselves the lord priests of it!
  2. They don’t offer the controls on the voice recording the experts in the field (Amazon and Google) do!

Update (2019-08-05): Dieter Bohn:

Apple’s handling of your Siri voice recordings is a really clear sign that its strident privacy stance has given the company a blind spot: when it DOES collect your data, it isn’t as good as everybody else at giving you controls for seeing and deleting it.

Update (2019-08-16): Sam Byford:

Apple has said that it will temporarily suspend its practice of using human contractors to grade snippets of Siri voice recordings for accuracy.

[…]

Apple did not comment on whether, in addition to pausing the program where contractors listen to Siri voice recordings, it would also stop actually saving those recordings on its servers. Currently the company says it keeps recordings for six months before removing identifying information from a copy that it could keep for two years or more.

See also: The Talk Show.

Nick Heer:

Plain-language explanations of practices that may be compromising to users’ privacy can be hard to write. I am certain that the opt-in rate would be extremely low if these devices asked users — during the onboarding process, for example — whether a selection of their voice recordings can be retained and later reviewed by a human being.

Nevertheless, it is unquestionably the right thing to do.

Update (2019-08-19): John Gruber (tweet):

Until the opt-in process is crystal clear, Apple should delete all existing recordings and confirm that it is no longer saving them. I don’t even know where to start with the fact that until this story broke, they were keeping copies with identifying information for six months. This defies everyone’s expectations of privacy for a voice assistant.

We should expect Apple to lead the industry on this front, but in fact, they’re far behind. Amazon has a FAQ written in plain language that explains how Alexa works, and how to view your voice recordings from Alexa-powered devices. You can review them in the Alexa app in Settings: Alexa Privacy (a pretty obvious location) or on the web. That settings page also has an option: “Use Voice Recordings to Improve Amazon Services and to Develop New Features”. I think Amazon should make clear that with this turned on, some of your recordings may be listened to by Amazon employees, but it’s not too hard to surmise that’s what’s going on.

Apple offers no such setting, and offers absolutely no way to know which, if any, of our Siri recordings have been saved for review by employees. This is something we should have explicit, precise control over, but instead it’s a completely black box we have no control over or insight into whatsoever.

Update (2019-08-29): Jay Peters (Hacker News):

For the Siri contractors, transcribing 1,000 voice commands means they likely had to do about two per minute, assuming they were working an eight-hour day.
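For what it’s worth, here is that pacing arithmetic spelled out, lined up against the roughly 150-files-per-hour quota from the EL PAÍS report above:

```python
# Jay Peters' figure: 1,000 voice commands per shift, assuming an 8-hour day
peters_per_hour = 1000 / 8            # 125 commands per hour
peters_per_minute = 1000 / (8 * 60)   # ~2.08 commands per minute

# EL PAÍS figure quoted earlier: ~150 files per hour over a 30-hour week
elpais_per_minute = 150 / 60          # 2.5 files per minute
elpais_per_week = 150 * 30            # 4,500 files per week
```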

Apple (MacRumors):

As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

John Gruber:

Apple also has a “Siri Privacy and Grading” FAQ, written in very clear language. Basically, Apple is admitting they fucked up on this grading thing, they’re owning up to it, and are committed to doing everything they should have been doing all along to protect users’ privacy and make everything as clear as possible to users.

Matthew Panzarino:

Apple says that it will continue using anonymized computer-generated written transcripts of your request to feed its machine learning engines with data, in a fashion similar to other voice assistants. These transcripts may be subject to Apple employee review.

In other words, if Siri was able to transcribe what you said, Apple will still retain any sensitive information you may have uttered. I don’t find the fact that it’s in text form rather than audio to make that much of a difference. There doesn’t seem to be a way to opt out of this, except by not using Siri at all.

Nick Heer:

A ramification of these changes is that hundreds of contracted workers in Ireland were laid off. That’s a horrible result for so many people. It reinforces that employees at tech companies need to carefully consider the impact of their product or service.



¯\_(ツ)_/¯

Apple employees are now part of the iOS privacy threat model.

Google employees are now part of the Android privacy threat model.

Amazon.... you get the idea. These corps are as ripe for anti-trust as Microsoft was, back in the day.
