Apple NeuralHash CSAM scanning system raises privacy concerns

Occurred: August 2021

A plan by Apple to start automatically scanning iPhones and iCloud accounts in the US for child sexual abuse material (CSAM) using perceptual hashing was heavily criticised by privacy advocates and others as overly intrusive. 

An August 2021 Financial Times article revealed Apple's NeuralHash system would have automatically scanned devices to identify photos featuring child sexual abuse before the images were uploaded to iCloud. Matches were to be reported to the US National Center for Missing & Exploited Children (NCMEC).
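At its core the approach was on-device perceptual hash matching: each photo is reduced to a compact fingerprint and compared against a list of hashes of known abuse images before upload. The sketch below illustrates the general idea with a simple average-hash in Python; it is not Apple's NeuralHash, which derived hashes from a convolutional neural network and matched them via private set intersection. The file name, blocklist value, and threshold are hypothetical.

```python
# Illustrative sketch only: a basic average-hash (aHash) comparison, NOT Apple's
# NeuralHash. The blocklist entry, file path, and threshold are hypothetical.
from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    """Compute a 64-bit perceptual hash: shrink, grayscale, threshold at the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of known-image hashes (in practice supplied as hashes
# of verified CSAM, never as the images themselves).
KNOWN_HASHES = {0x8F3B21C4D95E07AA}

def matches_blocklist(path: str, threshold: int = 5) -> bool:
    """Flag an image if its hash is within `threshold` bits of any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)

if __name__ == "__main__":
    print(matches_blocklist("upload_candidate.jpg"))
```

Perceptual hashes are designed so that small edits (resizing, recompression) leave the fingerprint nearly unchanged, which is why matching uses a Hamming-distance threshold rather than exact equality; that same tolerance is also what critics pointed to as a source of possible false positives.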

The plan was seen as potentially helpful to law enforcement in criminal investigations, but critics feared it could open the door to unnecessary or disproportionate legal and government demands for user data.

Widespread hostility persuaded Apple to delay the rollout of its CSAM detection system.

December 2021. Apple deleted all references to the plan from its website.

December 2022. Apple publicly dropped the system.

System 🤖

Operator: Apple
Developer: Apple
Country: USA
Sector: Technology
Purpose: Detect child pornography
Technology: Hash matching; Deep learning; Neural network; Machine learning
Issue: Security; Privacy; Surveillance; Accuracy/reliability
Transparency: 

Page info
Type: Issue
Published: January 2022
Last updated: September 2023