Apple Delays Child Safety Features
Joseph Cox (tweet, Hacker News, The Verge, MacRumors, TechCrunch):
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in the statement.
It sounds like they are delaying, indefinitely, both the Messages and iCloud Photos components.
My suggestions to Apple:
(1) Talk to the technical and policy communities before you do whatever you’re going to do. Talk to the general public as well. This isn’t a fancy new Touch Bar: it’s a privacy compromise that affects 1bn users.
(2) Be clear about why you’re scanning and what you’re scanning. Going from scanning nothing (but email attachments) to scanning everyone’s private photo library was an enormous delta. You need to justify escalations like this.
(3) As Nick says, client-side scanning is icky to people. There is a reason for this. Considering the number of privacy invasions users have learned to live with, the pushback on this line means something. Learn from it.
(4) Privacy-preserving cryptographic protocols aren’t going to distract people from the fact that what you’re trying to do is uncomfortable.
And (5) if you’re going to make your system design public, make all of it public. Withholding NeuralHash and then having it reverse-engineered and broken: that was a catastrophe.
There’s also the issue of the secondary server-side hashing algorithm, which Apple seems not to have mentioned until after people started criticizing NeuralHash. Are there other key components not mentioned in the whitepaper?
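Since the reverse-engineering point above hinges on how perceptual-hash matching works, here is a minimal illustrative sketch in Python. It uses a toy average hash, not Apple’s NeuralHash, and the blocklist value, threshold, and image path are all hypothetical. What it demonstrates: once the client-side hash function can be computed by anyone, unrelated images can be deliberately crafted to land within the match threshold, which is what researchers showed soon after NeuralHash was extracted.

```python
# Toy perceptual-hash matcher (average hash), NOT Apple's NeuralHash.
# Illustrates the basic shape of client-side matching against a blocklist.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """64-bit toy hash: grayscale, downscale to 8x8, threshold at the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist; a deployed system would ship these in blinded form.
KNOWN_HASHES = {0x8F3C27A015E499D1}

def matches_blocklist(path: str, threshold: int = 4) -> bool:
    """True if the image's hash falls within `threshold` bits of a known hash."""
    h = average_hash(path)
    return any(hamming(h, known) <= threshold for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(matches_blocklist("example.jpg"))  # hypothetical image path
```

This is also presumably why the second, independent server-side perceptual hash matters: a collision crafted against a publicly computable client-side function is unlikely to also collide under a function the attacker cannot run.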
To me, client-side scanning is THE issue. Server-side, do whatever you want. But MY device should be MINE, and should only do what I tell it and/or act for my benefit.
Scan things on “sharing” them, not on “storing” them.
Cindy Cohn (via Edward Snowden):
EFF is pleased Apple is now listening to the concerns of customers, researchers, civil liberties organizations, human rights activists, LGBTQ people, youth representatives, and other groups, about the dangers posed by its phone scanning tools. But the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely.
If you think Apple lacks the backbone to resist political pressure to expand the CSAM matching database, you certainly cannot hope for fully encrypted iCloud storage with no way of detecting abuse.
[…]
I am curious about the company’s next steps, though. […] I look forward to a solution that can alleviate many researchers’ concerns, but if, as with the App Store, trust has been burned, only Apple can rebuild it.
The other possibility is that the entire effort is now tainted, making this “delay” just a face-saving way for Apple to drop the technology like the hot potato it became. Would there be a massive public outcry if 2022’s Worldwide Developers Conference came and went with no mention of CSAM detection in iOS 16?
It’s a loss for Apple because all they managed to do was piss off everyone.
Previously: