Apple Vs. ICAC

Sep 30 / Admin

“How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution?”

- Erik Neuenschwander,
director of user privacy and child safety, Apple

Apple has abandoned its plan to build a privacy-preserving tool for scanning iCloud Photos for child sexual abuse material (CSAM), citing the privacy and security risks such scanning would create. A new child safety group called Heat Initiative is demanding that Apple detect and remove CSAM from iCloud and give users more ways to report it.

In response, Apple outlined its decision to focus on on-device Communication Safety features rather than the CSAM scanning tool. Apple's director of user privacy and child safety, Erik Neuenschwander, explained that scanning all iCloud data would create security risks and potential privacy breaches. Heat Initiative is disappointed by Apple's decision, arguing that the company should take responsibility for detecting CSAM.

Apple is now emphasizing on-device nudity detection and offers an API that lets third-party developers integrate its Communication Safety features into their own apps; a rough sketch of that integration appears below. Apple argues that connecting vulnerable users with local resources and law enforcement is a better approach than the company itself processing CSAM reports. The episode highlights the ongoing tension between child protection and user privacy, especially where encryption is involved.
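For developers wondering what that third-party integration could look like, here is a minimal Swift sketch using Apple's SensitiveContentAnalysis framework, the public API behind these on-device Communication Safety and Sensitive Content Warning features. This is an illustration of the general flow, not code from the article: the function name shouldBlurImage(at:) is hypothetical, the app needs the Sensitive Content Analysis entitlement, and detection only runs when the user has opted in via Settings.

```swift
import Foundation
import SensitiveContentAnalysis

// Hypothetical helper: decide whether an incoming image should be blurred
// before display. All analysis runs on device; the image is never uploaded.
func shouldBlurImage(at fileURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user has not enabled the feature, the policy is .disabled
    // and no detection takes place.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: fileURL)
        return analysis.isSensitive
    } catch {
        // Fail open or closed according to your own product policy;
        // this sketch simply shows the image unblurred on error.
        print("Sensitivity analysis failed: \(error)")
        return false
    }
}
```

The point relevant to the debate above is that everything in this path stays on the device and only runs when the user has turned the feature on, which is exactly the trade-off Apple is defending against server-side CSAM scanning.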

Read the full article in WIRED Magazine HERE