Apple NeuralHash CSAM scanning

Released: August 2021


The Financial Times revealed that Apple was planning to start scanning iPhones and iCloud in the US for child sexual abuse material (CSAM), with violations reported to law enforcement authorities.

The new system would automatically scan devices to identify whether they contained photos featuring child sexual abuse before the images were uploaded to iCloud, with matches reported to the US National Center for Missing & Exploited Children (NCMEC).
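Apple's NeuralHash internals were not public at the time; the sketch below only illustrates the general perceptual-hashing idea behind such matching, using a simple average-hash variant rather than NeuralHash itself. The 8x8 pixel grid, the example hash value, and the distance threshold are all hypothetical, chosen for illustration.

# Illustrative perceptual-hash matching (NOT Apple's NeuralHash).
# Assumes images are already decoded to 8x8 grayscale pixel grids;
# the blocklist hash and threshold below are hypothetical values.

def average_hash(pixels_8x8):
    """Build a 64-bit hash: each bit is 1 if the pixel exceeds the mean brightness."""
    flat = [p for row in pixels_8x8 for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count the differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash, blocklist, threshold=4):
    """Flag an image whose hash is within `threshold` bits of any known hash."""
    return any(hamming_distance(image_hash, known) <= threshold for known in blocklist)

# Hypothetical usage: one known hash and one candidate image.
known_hashes = {0x9F3A5C7E12B4D680}
candidate = [[120, 130, 90, 200, 210, 45, 60, 180]] * 8  # fake 8x8 grayscale image
print(matches_blocklist(average_hash(candidate), known_hashes))

Unlike cryptographic hashes, perceptual hashes are designed so that visually similar images produce nearby hashes, which is why matching tolerates a small Hamming distance rather than requiring exact equality.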

The move, which drew considerable criticism, was seen as helping law enforcement in criminal investigations but also as potentially opening the door to broader legal and government demands for user data.

The backlash persuaded Apple to delay the rollout of its CSAM detection system and child safety features.

Apple deleted all references to the plan from its website in December 2021, and publicly dropped the system a year later.

Operator: Apple
Developer: Apple
Country: USA
Sector: Technology
Purpose: Detect child sexual abuse material (CSAM)
Technology: Perceptual hashing
Issue: Security; Privacy; Surveillance; Accuracy/reliability
Transparency: 

Page info
Type: Incident
Published: January 2022
Last updated: December 2022