Apple Sued for Not Searching iCloud for CSAM
Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sex abuse materials (CSAM).
The proposed class action comes after Apple scrapped a controversial CSAM-scanning tool last fall that was supposed to significantly reduce the spread of CSAM in its products. Apple defended its decision to kill the tool after dozens of digital rights groups raised concerns that governments could seek to repurpose the functionality to illegally surveil Apple users. Apple was also concerned that bad actors could use the functionality to exploit its users, and it sought to protect innocent users from false content flags.
The child sex abuse survivors suing accuse Apple of using this cybersecurity defense to ignore its mandatory CSAM reporting duties. If they win over a jury, Apple could face more than $1.2 billion in penalties. And, perhaps most notably for privacy advocates, Apple could also be forced to “identify, remove, and report CSAM on iCloud and implement policies, practices, and procedures to prevent continued dissemination of CSAM or child sex trafficking on Apple devices and services.” That could mean a court order to implement the controversial scanning tool or an alternative that meets industry standards for mass-detecting CSAM.
[…]
To build the case, the survivors’ lawyers dug through 80 cases in which law enforcement found CSAM on Apple products, identifying a group of 2,680 survivors as potential class members.
Previously:
- Apple Abandons CSAM Scanning
- Advanced Data Protection for iCloud
- Apple Removes References to Controversial CSAM Scanning Feature