Apple NeuralHash CSAM scanning
Updated: January 2022
In August 2021, the Financial Times revealed that Apple was planning to start scanning iPhones and iCloud in the US for child sexual abuse material (CSAM), with violations reported to law enforcement authorities.
The system would automatically check photos on a device against a database of known CSAM images before they are uploaded to iCloud, with matches reported to the US National Center for Missing and Exploited Children (NCMEC).
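The matching step relies on perceptual hashing (see the Technology entry below): images are reduced to compact fingerprints that stay similar under resizing, compression and minor edits, and fingerprints are compared against a database of known hashes. The following Python sketch illustrates the general idea using a simple difference hash (dHash) and a Hamming-distance threshold; it is not Apple's NeuralHash, which derives its hash from a neural network, and the database values and threshold are hypothetical placeholders. It requires the Pillow library.

from PIL import Image

def dhash(image_path: str, hash_size: int = 8) -> int:
    """Compute a 64-bit difference hash: compare adjacent pixel brightness."""
    # Grayscale and shrink to (hash_size+1) x hash_size so the hash reflects
    # coarse image structure rather than resolution or small edits.
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS
    )
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical usage: flag an image if its hash is close to any entry in a
# database of known hashes. Values below are illustrative only.
KNOWN_HASHES = {0x1A2B3C4D5E6F7081}
THRESHOLD = 5

def matches_known(image_path: str) -> bool:
    h = dhash(image_path)
    return any(hamming_distance(h, known) <= THRESHOLD for known in KNOWN_HASHES)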
The move, which drew considerable criticism, was seen as helping law enforcement with criminal investigations but also as potentially opening the door to broader legal and government demands for user data.
The reaction persuaded Apple to delay the rollout of its CSAM detection system and related child safety features. In December 2021, Apple removed all references to the system from its website.
Purpose: Detect child sexual abuse material (CSAM)
Technology: Perceptual hashing
Issue: Security; Privacy; Surveillance; Accuracy/reliability