Radar AI Training
Marko Zivkovic (via Ric Ford, Reddit):
Apple announced its plans for a new opt-in Apple Intelligence training program. In essence, users can let Apple use content from their iPhone to train AI models. The training itself happens entirely on-device, and it incorporates a privacy-preserving method known as Differential Privacy.
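Apple hasn't published the details of its on-device pipeline, but the core idea of local differential privacy can be illustrated with the classic randomized-response mechanism: each device adds calibrated noise to its own data before anything leaves the device, and the server can still estimate aggregate statistics. This is a minimal sketch of that general technique, not Apple's actual implementation; all names and the `p_truth` parameter are invented for illustration.

```python
import random

def randomized_response(true_value: bool, p_truth: float = 0.75) -> bool:
    """Report the true bit with probability p_truth, otherwise a coin flip.

    Classic randomized response: no single report reveals the true value
    with certainty, which is what gives each user plausible deniability.
    """
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5

def estimate_true_rate(reports: list[bool], p_truth: float = 0.75) -> float:
    """Invert the known noise to estimate the population rate.

    observed = p_truth * true_rate + (1 - p_truth) * 0.5, solved for true_rate.
    """
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth
```

With enough reports the aggregate estimate converges on the true rate even though every individual submission is noisy, which is the trade-off such schemes are built on.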
The opt-out seems to be via the Share iPhone & Watch Analytics button, which is the iOS equivalent of the Mac button that Mysk demonstrated Apple doesn’t actually honor.
In a social media post, developer Joachim outlined a new section of Apple's privacy notice in the Feedback application. When uploading an attachment as part of a bug report, such as a sysdiagnose file, users now need to give Apple consent to use the uploaded content for AI training.
After a long time, I filed another bug report using Feedback Assistant because the bug was bad enough that it’s worth the effort of writing it all down.
When uploading a sysdiagnose (or probably any other attachments) you get the usual privacy notice that there is likely a lot of private and other sensitive info in those log files. It’s not a great feeling but it is what it is with diagnostic data and I mostly trust the folks at Apple to treat it with respect and I trust the Logging system to redact the most serious bits.
However, when filing a feedback today I noticed a new addition to the privacy notice:
“By submitting, you […] agree that Apple may use your submission to [train] Apple Intelligence models and other machine learning models.”
WTF? No! I don’t want that. It’s extremely shitty behavior to a) even ask me this in this context where I entrust you with my sensitive data to help you fix your shit to b) hide it in the other privacy messaging stuff and to c) not give me any way to opt out except for not filing a bug report.
I could understand if the plan were for Apple to train some kind of internal AI model to help them triage bugs. Some developers might still have a problem with this because they don’t want their private data leaking out of the context of their particular bug. But when Apple says Apple Intelligence models, that sure sounds like training the general models that will be available to the general public.
They probably have something in the terms of service that allows them to retroactively do this for previously submitted bugs, going back decades. Really, the only solution for keeping your data private is not to share data—even for internal use by the Privacy Company—that you don’t want to be shared. That is, only submit sysdiagnoses from a clean test Mac.
Also, there is a lot of sensitive information in a sysdiagnose. Taking it and throwing it into a big pile of data and compute and hoping something useful comes out of it is not treating my data with the respect it deserves.
On the topic of Radar, also see this thread by Max Seelemann:
Apple’s disrespect for the time and energy going into developer bug reports is making me sad. 🙁
Reported a performance issue with a sample app a couple of months ago. Of course, no feedback.
And now, Beta 2, they just ask whether it’s still present and request a sysdiagnose. They could have just launched the sample themselves and would have seen that NOTHING has changed. My guess is that no single developer at Apple has ever seen the issue and they just randomly ask about this out of procedure? Depressing.
My model of the radar world is that they tag reports like “Finder icon position” or “… performance” and the devs add tags to their commits. Whenever a release contains a commit where the tags match, you automatically get those “please verify” mails.
Like “if we touch a part of the code that is closely related to a report, just ask the reporter if we fixed it as a side effect.”
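Seelemann's speculated mechanism is simple enough to sketch: reports and commits each carry tags, and a release whose commits share any tag with an open report triggers a "please verify" request to the reporter. This is a toy model of his guess, not Apple's actual system; every name and data shape here is invented.

```python
def reports_to_ping(release_commits: list[dict], open_reports: list[dict]) -> list[str]:
    """Return IDs of open reports whose tags overlap any commit in the release."""
    # Collect every tag touched by commits in this release.
    release_tags: set[str] = set()
    for commit in release_commits:
        release_tags.update(commit["tags"])
    # Any report sharing at least one tag gets a "please verify" request.
    return [r["id"] for r in open_reports if release_tags & set(r["tags"])]

commits = [{"sha": "abc123", "tags": ["finder", "icon-layout"]}]
reports = [
    {"id": "FB0001", "tags": ["finder"]},
    {"id": "FB0002", "tags": ["mail"]},
]
print(reports_to_ping(commits, reports))  # → ['FB0001']
```

Such coarse matching would also explain the false positives he describes: touching related code asks the reporter to verify a fix that may never have been attempted.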
I doubt this is the case because I’ve had bugs that did get fixed but where I never got this e-mail, even though really rough tagging would have made my bugs match. Or maybe some percentage of bugs just never get tagged.
The best is when they personally reach out via DM and then you make them an example and you NEVER hear back.
My favorite is when they do write back once and say that you can ask for updates on the bug, and then each year you ask for an update and never ever hear anything again.
Previously:
- macOS Tahoe 26 Developer Beta 2
- Apple Turnaround
- Soured
- Private GitHub Data Lingers in Copilot Training
- Martin Pilkington, RIP
- Feedback Feedback
- AI Companies Ignoring Robots.txt
- Apple Intelligence Training
- Updated Adobe Terms of Use
- Slack AI Privacy
- Tumblr and WordPress to Sell Users’ Data to Train AI Tools
- Reddit AI Training Data and IPO
- Zoom ToS Allowed Training AI on User Content With No Opt Out
- GrammarlyGO Training on User Content With Questionable Opt Out
- ChatGPT Is Ingesting Corporate Secrets
- Lawsuits Over Apple Analytics Switch
Whenever my old addiction of bringing issues to Apple’s attention returns, I either remember Corbin Dunn’s post from 2019 or read it again as part of my yearly inoculation.
https://mjtsai.com/blog/2019/03/11/the-sad-state-of-logging-bugs-for-apple/