Fraser Speirs (tweet):
The problem with Apple’s iOS education offerings that started to really make me wonder what the future held came when I realised that iTunes U was clearly just being left to die a slow death. At the time of writing, iTunes U still does not support basic iOS multitasking features that were introduced in iOS 9 - four releases ago. […] Whatever learning platform a school uses is a vital part of the work of the school and, if it’s not evolving, it’s dying. Make no mistake: iTunes U is a dying service and it would be more honest and respectable of Apple just to announce the date on which it will be put out of its misery.
[…]
The worst issue by far in iOS sysadmin is backup and restore of supervised devices. This process has never been properly documented and it seems to change freely with iOS versions. Every time I have to do it, it takes at least three hours of experimentation to get something that mostly works.
[…]
We’ve been using 9.7” iPad Pro hardware in this cycle and, while the hardware remains fast and capable, I have not been very pleased with durability. We have seen a lot of fatigue-related screen damage - that is, damage not caused by a catastrophic accident but rather just repeated put-downs in a schoolbag.
[…]
In the final analysis, I think that the long-term effect of tablets will be that they forced laptops to get better.
Update (2019-08-01): Fraser Speirs:
Once you abstract your data and apps from the hardware, the hardware largely stops mattering. And if I was Tim Cook, that would keep me awake every night.
Fraser Speirs:
I think one of the key features that GSuite has here is that the user directory is integrated, so collaboration is easy because identity is ‘built in’ to the platform so to speak. iCloud doesn’t really have the same kind of thing.
Tags: Backup, Education, Google Chromebook, iOS, iOS 12, iPad, iTunes U
Hugo Gutiérrez (Google translation, via Adrian Tineo):
Listening to the private recordings is carried out by a company subcontracted by Apple, just as Google does, as EL PAÍS previously reported. These reviewers are responsible for analyzing private conversations and requests made to the virtual assistant on Apple devices.
[…]
In the case of Apple’s transcribers, working conditions were much better than those of the employees who did this work for Google, although the work is almost identical. The reviewers consulted confirm that they were not paid per audio clip but received a monthly salary. “You could choose the number of contracted hours. In my case I was part-time, 30 hours a week, and earned 1,100 euros gross per month.” They did, however, have a quota to meet of about 150 audio files per hour; that is, about 4,500 recordings a week.
[…]
What was strictly controlled was the number of recordings processed; failing to meet the quota was grounds for dismissal. “They changed it several times in the months I was working for this company. In fact, in my last weeks there, the target they set was practically impossible to meet, and they knew it,” says a former employee.
It was previously reported that Apple had humans reviewing Siri audio data, but it was not known that they were contractors.
Alex Hern (MacRumors):
Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
[…]
The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
[…]
“There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.
“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
Devin Coldewey:
Apple’s privacy policy states regarding non-personal information (under which Siri queries would fall):
We may collect and store details of how you use our services, including search queries. This information may be used to improve the relevancy of results provided by our services. Except in limited instances to ensure quality of our services over the Internet, such information will not be associated with your IP address.
It’s conceivable that the phrase “search queries” is inclusive of recordings of search queries. And it does say that it shares some data with third parties. But nowhere is it stated simply that questions you ask your phone may be recorded and shared with a stranger. Nor is there any way for users to opt out of this practice.
Jason Snell (tweet):
It doesn’t matter to me if this is Amazon or Apple. I don’t want human beings listening to the audio these devices record. In fact, I don’t want recordings made of my audio, period—I want the audio processed and immediately discarded.
Apple boasts constantly about taking user privacy seriously. There’s one right response to this report, and it’s to change its policies and communicate them clearly. A mealy-mouthed response about how the eavesdropping is done in a secure facility without an Apple ID attached is not good enough.
David Heinemeier Hansson:
Steve Jobs: “Privacy means people know what they’re signing up for, in plain English, and repeatedly... Let them know precisely what you’re going to do with their data.”
How many Siri users know that contractors are listening in when they trigger it, intentionally or not?
Nick Heer:
Even so, there should surely be a way to opt out entirely and not allow any of your Siri conversations to be selected for review. It’s absurd that there seemingly isn’t a way to do this — turning off Siri entirely is not a solution — though I’ve reached out to confirm if disabling the analytics sharing options in Settings would opt users out. Also, as with Google, I have to question why users are not first asked whether a human can review their audio recording.
Update (2019-08-01): Peter Cohen:
I’m unpacking this in real-time today so I apologize for the thread. But near as I can tell, Apple doesn’t give any way at all of excluding Siri recording samples from the data you share with Apple.
Michael Potuck (Hacker News):
However, in our recent poll, almost 65% of 9to5Mac readers said they want an option to turn off Apple’s ability to record and listen to Siri activations.
Now, Jan Kaiser has shared on GitHub an iOS profile that turns off server-side logging of Siri commands (if you prefer to make your own profile to do this, head below).
[…]
Kaiser also notes that you can make your own profile to restrict Siri’s logging with Apple Configurator if you don’t want to download the one shared on GitHub.
Why doesn’t iOS have a built-in setting to control this? At the very least, it should honor the general setting to not send data back to Apple to help improve its products.
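For the curious, here is a minimal sketch of what such a configuration profile looks like. The outer wrapper keys (PayloadType, PayloadIdentifier, PayloadUUID, and so on) are the standard configuration-profile structure; the inner payload is a hypothetical placeholder, since the exact preference domain and key that Kaiser’s profile sets aren’t reproduced here. Check his GitHub repository, or build the equivalent yourself in Apple Configurator.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Standard wrapper keys, common to every configuration profile. -->
    <key>PayloadDisplayName</key>
    <string>Disable Server-Side Siri Logging</string>
    <key>PayloadIdentifier</key>
    <string>com.example.disable-siri-logging</string>
    <key>PayloadType</key>
    <string>Configuration</string>
    <key>PayloadUUID</key>
    <string>11111111-2222-3333-4444-555555555555</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadContent</key>
    <array>
        <dict>
            <!-- Hypothetical inner payload: the real profile sets an
                 Apple preference that is not named in this post, so
                 com.example.siri-logging is a placeholder only. -->
            <key>PayloadType</key>
            <string>com.example.siri-logging</string>
            <key>PayloadIdentifier</key>
            <string>com.example.disable-siri-logging.payload</string>
            <key>PayloadUUID</key>
            <string>11111111-2222-3333-4444-666666666666</string>
            <key>PayloadVersion</key>
            <integer>1</integer>
        </dict>
    </array>
</dict>
</plist>
```

A profile like this can be signed and distributed with Apple Configurator, then installed (and later removed) from Settings on the device.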
Update (2019-08-02): Matthew Panzarino (tweet, MacRumors, Bloomberg):
Apple says it will review the process that it uses, called grading, to determine whether Siri is hearing queries correctly, or being invoked by mistake.
In addition, it will be issuing a software update in the future that will let Siri users choose whether they participate in the grading process or not.
[…]
When this story broke, I dipped into Apple’s terms of service myself and, though there are mentions of quality control for Siri and data being shared, I found that it did fall short of explicitly and plainly making it clear that live recordings, even short ones, are used in the process and may be transmitted and listened to.
Russell Ivanovic:
What happens on your iPhone stays on your iPhone.*
*Until it turns out it doesn’t. Then if the story doesn’t get much traction and you don’t notice, we’ll still pretend it does. But when it blows up we’ll fix it. Clear?
Sam Gross:
I’m wondering whether there’s an internal story behind this. I’d bet a bunch of people said no, but some senior person, under pressure to improve Siri, said yes.
Phillip Molly Malone:
The issue isn’t that they do it! The issue is twofold, to me anyway:
- They are waging a holy war on privacy and making themselves the lord priests of it!
- They don’t offer the controls on voice recordings that the experts in the field (Amazon and Google) do!
Update (2019-08-05): Dieter Bohn:
Apple’s handling of your Siri voice recordings is a really clear sign that its strident privacy stance has given the company a blind spot: when it DOES collect your data, it isn’t as good as everybody else at giving you controls for seeing and deleting it.
Update (2019-08-16): Sam Byford:
Apple has said that it will temporarily suspend its practice of using human contractors to grade snippets of Siri voice recordings for accuracy.
[…]
Apple did not comment on whether, in addition to pausing the program where contractors listen to Siri voice recordings, it would also stop actually saving those recordings on its servers. Currently the company says it keeps recordings for six months before removing identifying information from a copy that it could keep for two years or more.
See also: The Talk Show.
Nick Heer:
Plain-language explanations of practices that may be compromising to users’ privacy can be hard to write. I am certain that the opt-in rate would be extremely low if these devices asked users — during the onboarding process, for example — whether a selection of their voice recordings can be retained and later reviewed by a human being.
Nevertheless, it is unquestionably the right thing to do.
Update (2019-08-19): John Gruber (tweet):
Until the opt-in process is crystal clear, Apple should delete all existing recordings and confirm that it is no longer saving them. I don’t even know where to start with the fact that until this story broke, they were keeping copies with identifying information for six months. This defies everyone’s expectations of privacy for a voice assistant.
We should expect Apple to lead the industry on this front, but in fact, they’re far behind. Amazon has a FAQ written in plain language that explains how Alexa works, and how to view your voice recordings from Alexa-powered devices. You can review them in the Alexa app in Settings: Alexa Privacy (a pretty obvious location) or on the web. That settings page also has an option: “Use Voice Recordings to Improve Amazon Services and to Develop New Features”. I think Amazon should make clear that with this turned on, some of your recordings may be listened to by Amazon employees, but it’s not too hard to surmise that’s what’s going on.
Apple offers no such setting, and offers absolutely no way to know which, if any, of our Siri recordings have been saved for review by employees. This is something we should have explicit, precise control over, but instead it’s a completely black box we have no control over or insight into whatsoever.
Update (2019-08-29): Jay Peters (Hacker News):
For the Siri contractors, transcribing 1,000 voice commands a day means they likely had to do about two per minute, assuming they were working an eight-hour day.
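The arithmetic behind that estimate, assuming an eight-hour shift:

\[
\frac{1000 \text{ commands/day}}{8 \text{ h/day} \times 60 \text{ min/h}} \approx 2.1 \text{ commands per minute}
\]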
Apple (MacRumors):
As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:
- First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
- Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
- Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
John Gruber:
Apple also has a “Siri Privacy and Grading” FAQ, written in very clear language. Basically, Apple is admitting they fucked up on this grading thing, they’re owning up to it, and are committed to doing everything they should have been doing all along to protect users’ privacy and make everything as clear as possible to users.
Matthew Panzarino:
Apple says that it will continue using anonymized computer-generated written transcripts of your request to feed its machine learning engines with data, in a fashion similar to other voice assistants. These transcripts may be subject to Apple employee review.
In other words, if Siri was able to transcribe what you said, Apple will still retain any sensitive information you may have uttered. I don’t find the fact that it’s in text form rather than audio to make that much of a difference. There doesn’t seem to be a way to opt out of this, except by not using Siri at all.
Nick Heer:
A ramification of these changes is that hundreds of contracted workers in Ireland were laid off. That’s a horrible result for so many people. It reinforces that employees at tech companies need to carefully consider the impact of their product or service.
Tags: Apple, Apple Configurator, Firing, HomePod, iOS, iOS 12, Privacy, Siri
Andrew Griffin (via Phil Schiller, Hacker News):
“I can tell you that privacy considerations are at the beginning of the process, not the end,” says Craig Federighi, Apple’s senior vice president of software engineering. “When we talk about building the product, among the first questions that come out is: how are we going to manage this customer data?”
[…]
It might seem unlikely that any normal phone would be subjected to this kind of beating, given the slim chance of its owner passing through an environment that chills it to -40C or heats it to 110C. But the fear here is not about normal use at all. If the chips were found to be insecure under this kind of pressure, then bad actors would immediately start putting phones through it, and all the data they store could be boiled out of them.
If such a fault were found after the phones make their way to customers, there would be nothing Apple could do. Chips can’t be changed after they are in people’s hands, unlike software updates. So it looks instead to find any possible dangers in this room, tweaking and fixing to ensure the chips can cope with anything thrown at them.
[…]
Apple, by design, doesn’t even know which of its own employees it is harvesting [health] data about. The employees don’t know why their data is being harvested, only that this work will one day end up in unknown future products.
Nick Heer:
This claim goes uncontested by Griffin, but it’s wrong. All iCloud data created by Chinese users is stored in China; even the iCloud user agreement for Chinese users is between the user and GCBD, not the user and Apple. Also, Apple’s software actively encourages customers to use iCloud services from a few moments after they power up a device for the first time. It is therefore misleading, at best, to state that Apple collects less data. The company may not collect behavioural data to the same extent as its competitors, but that does not apply to user-provided data.
The next paragraph is similarly misleading[…]
[…]
After re-reading this, it’s clear that my disputes are with the reporter’s explanations, not Federighi’s.
John Gruber (tweet):
Google and Facebook are both pushing back against Apple, arguing that Apple’s stance on privacy is only possible because they charge a lot of money for their products.
I think the point that needs to be made is that free and low-cost products can be subsidized by privacy-respecting advertising — but privacy-respecting advertising is not as profitable as privacy-invasive advertising, as exemplified on Facebook and Google’s humongous platforms.
Tags: Advertising, Apple, China, Craig Federighi, iOS, iOS 12, Privacy, Secure Enclave
Kif Leswing (via John Voorhees, MacRumors):
Unlike content moderators at Silicon Valley companies such as Facebook or YouTube that rely on tens of thousands of contractors, Apple’s app reviewers work for Apple, people familiar with the process said. They’re paid hourly, have employee badges and get Apple benefits such as health care. Everyone starts out reviewing iPhone apps, and as reviewers become more senior, they are trained to evaluate apps with in-app purchases, subscriptions, Apple Watch and Apple TV.
[…]
App Review is organized under the marketing umbrella at Apple and always has been, even before Schiller took over the greater App Store marketing and product departments in late 2015. Although Schiller is involved in decision-making through the ERB, people who worked at the App Review office said that he rarely if ever visits the office where the review takes place.
According to people familiar with app review operations, day-to-day oversight mainly falls to a vice president at Apple, Ron Okamoto, and a director who joined Apple when it bought TestFlight in 2015, whom CNBC is not naming because of security reasons. Reviewers say they sometimes receive feedback from developers that can be threatening.
[…]
Reviewers have daily quotas of between 50 and 100 apps, and the number of apps any individual reviewer gets through in an hour is tracked by software called Watchtower, according to screenshots seen by CNBC. Reviewers are also judged on whether their decisions are later overturned and other quality-oriented stats.
Tags: App Review, App Store, Apple, iOS, iOS 12, Mac, Mac App Store, macOS 10.14 Mojave, Phil Schiller, TestFlight