Apple Abandons CSAM Scanning
After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.
This is kind of surprising because the CSAM detection system seemed designed to work on-device precisely so that it could coexist with the end-to-end encrypted iCloud Photo Library that just arrived via Advanced Data Protection.
The company told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, the company is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own applications.
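As a rough illustration of what that third-party story might look like, here is a minimal Swift sketch using the SensitiveContentAnalysis framework Apple later shipped for exposing these checks to other apps. The framework is not named in the article, and the exact calls shown here are a simplified assumption; the analyzer only returns results when the user (or a parent, via Screen Time) has enabled the relevant sensitive-content setting.

```swift
import Foundation
import SensitiveContentAnalysis  // assumed framework for Communication Safety–style checks

// Minimal sketch: ask the system whether an incoming image appears to contain nudity,
// so the app can blur it and show an intervention UI.
func shouldBlurIncomingImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user has not enabled sensitive-content warnings, the policy is .disabled
    // and the app should do nothing.
    guard analyzer.analysisPolicy != .disabled else {
        return false
    }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive  // true → blur and warn before displaying
    } catch {
        return false  // on analysis failure, fail open (an app-specific choice)
    }
}
```

The notable design point is that all of this runs on-device and only when the user has opted in, which is consistent with how Apple describes Communication Safety in the quote above.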
Previously:
- Advanced Data Protection for iCloud
- Apple Removes References to Controversial CSAM Scanning Feature
- Revised Messages Communication Safety Feature in iOS 15.2
- Apple Delays Child Safety Features
- The Risks of Client-Side Scanning
- Scanning iCloud Photos for Child Sexual Abuse
Update (2022-12-14): See also: Slashdot.
Update (2023-09-01): Tim Hardwick (Hacker News):
Apple on Thursday provided its fullest explanation yet for last year abandoning its controversial plan to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos.
Apple’s statement, shared with Wired and reproduced below, came in response to child safety group Heat Initiative’s demand that the company “detect, report, and remove” CSAM from iCloud and offer more tools for users to report such content to the company.
There was no realistic way for Apple to promise that it will not comply with future requirements to process government-supplied databases of “CSAM images” that also include matches for materials used by critics and protestors.
Apple’s letter is good, but it basically just repeats the arguments that we were all making when Apple announced on-device scanning. You have to wonder why the company decided to pursue the feature in the first place.
Update (2023-09-04): Nick Heer:
What is a little bit surprising is that Apple gave to Wired a copy of the email (PDF) Gardner sent — apparently to Tim Cook — and the response from Apple’s Erik Neuenschwander. In that letter, Neuenschwander notes that scanning tools can be repurposed on demand for wider surveillance, something it earlier denied it would comply with but nevertheless remains a concern; Neuenschwander also notes the risk of false positives.
[…]
One of the stories on Heat Initiative’s website concerns a man who abused his then-fiancée’s daughter in photo and video recordings, some of which were stored in his personal iCloud account. It is not clear to me how this case and others like it would have been discovered by Apple even if it had proceeded with its proposed local CSAM detection solution, as it would only alert on media already known to reporting authorities like NCMEC.
[…]
I simply think my own files are private regardless of where they are stored. Public iCloud photo albums are visible to the world and should be subject to greater scrutiny. Apple could do at least one thing differently: it is surprising to me that shared public iCloud albums do not have any button to report misuse.
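For context on Heer’s point about detection scope: the proposed system compared perceptual hashes of a user’s photos against a database of hashes of already-known CSAM, so previously unseen material could never match. The sketch below is a deliberately simplified, hypothetical model of that matching scheme (a plain hash-set lookup with an alert threshold); the real design used NeuralHash, blinded databases, private set intersection, and threshold secret sharing, none of which are modeled here, and all names and the threshold value are illustrative.

```swift
import Foundation

// Hypothetical, highly simplified model of "match against known hashes only".
// Real perceptual hashes are derived from image content; here they are opaque Data values.
struct KnownHashDatabase {
    let knownHashes: Set<Data>  // hashes supplied by reporting authorities such as NCMEC
    func contains(_ hash: Data) -> Bool { knownHashes.contains(hash) }
}

struct MatchResult {
    let matchCount: Int
    let exceedsThreshold: Bool
}

// Only photos whose hashes already appear in the database can produce a match,
// which is why novel abuse imagery (as in the Heat Initiative case) would not be flagged.
func evaluateLibrary(photoHashes: [Data],
                     against db: KnownHashDatabase,
                     threshold: Int = 30) -> MatchResult {
    let matches = photoHashes.filter { db.contains($0) }.count
    return MatchResult(matchCount: matches, exceedsThreshold: matches >= threshold)
}
```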