Wednesday, December 7, 2022

Apple Abandons CSAM Scanning

Apple (via MacRumors):

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

This is kind of surprising because the CSAM detection was designed to run on-device, seemingly so that it could work alongside the end-to-end encrypted iCloud Photo Library that just arrived.

Lily Hay Newman:

The company told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, the company is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own applications.

Previously:

Update (2022-12-14): See also: Slashdot.
