Apple NeuralHash CSAM scanning

Occurred: August 2021

A plan by Apple to start automatically scanning iPhones and iCloud accounts in the US for child sexual abuse material (CSAM) using perceptual hashing was heavily criticised by privacy advocates and others as overly intrusive. 

An August 2021 Financial Times article revealed that Apple's NeuralHash system would have automatically scanned photos on users' devices for known child sexual abuse imagery before they were uploaded to iCloud. Matches were to be reported to the US National Center for Missing & Exploited Children (NCMEC).
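
Conceptually, a system of this kind computes a compact perceptual hash of each photo on the device and compares it against a database of hashes of known abuse imagery, flagging near matches. The Python sketch below is a minimal illustration of that matching step only, using a toy difference hash and a Hamming-distance comparison; it is not Apple's NeuralHash (which derived hashes from a neural-network embedding), and the image data, blocklist values and threshold are hypothetical.

```python
# Minimal sketch of perceptual-hash matching (illustrative only, NOT NeuralHash).
# A toy "difference hash" is computed from a small grayscale image and compared
# against a blocklist of known hashes by Hamming distance.

def dhash(pixels):
    """Build a hash with one bit per horizontally adjacent pixel pair."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash, blocklist, threshold=3):
    """Flag an image whose hash is within `threshold` bits of any known hash."""
    return any(hamming(image_hash, known) <= threshold for known in blocklist)

# Hypothetical 4x5 grayscale image and a placeholder blocklist entry.
image = [
    [10, 40, 80, 120, 160],
    [20, 60, 100, 140, 180],
    [30, 70, 110, 150, 190],
    [40, 80, 120, 160, 200],
]
blocklist = {0b0000000000000000}  # stand-in hash values, not real data

print(matches_blocklist(dhash(image), blocklist))  # True: within 3 bits of a known hash
```

Because perceptual hashes tolerate small changes to an image, matching is done by distance rather than exact equality, which is also why critics raised accuracy and collision concerns.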

The plan had been seen as potentially helpful to law enforcement in criminal investigations, but critics feared it might open the door to unnecessary or disproportionate legal and government demands for user data.

Widespread hostility persuaded Apple to delay the rollout of its CSAM detection system. The company removed all references to the plan from its website in December 2021 and publicly abandoned the system a year later.

Operator: Apple
Developer: Apple
Country: USA
Sector: Technology
Purpose: Detect child sexual abuse material (CSAM)
Technology: Hash matching; Deep learning; Neural network; Machine learning
Issue: Security; Privacy; Surveillance; Accuracy/reliability
Transparency: 

Page info
Type: Issue
Published: January 2022
Last updated: September 2023