Apple NeuralHash CSAM scanning
Released: August 2021
In August 2021, the Financial Times revealed that Apple was planning to start scanning iPhones and iCloud in the US for child sexual abuse material (CSAM), with violations reported to law enforcement authorities.
The system would automatically scan devices to identify photos depicting child sexual abuse before the images were uploaded to iCloud, with matches reported to the US National Center for Missing & Exploited Children (NCMEC).
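As a rough illustration of the perceptual hashing approach involved (listed under Technology below), the sketch that follows computes a simple difference hash of an image and compares it against a database of known hashes by Hamming distance. It is not Apple's NeuralHash, which derives hashes from a neural network and performs matching via a private set intersection protocol; the function names, the 64-bit hash size, and the match threshold here are illustrative assumptions.

```python
# Minimal perceptual-hash matching sketch (a simple dHash), for illustration only.
# Not Apple's NeuralHash. Requires Pillow (pip install Pillow).
from PIL import Image

MATCH_THRESHOLD = 10  # maximum Hamming distance treated as a match (assumed value)


def dhash(path: str) -> int:
    """Compute a 64-bit difference hash: greyscale, shrink to 9x8 pixels,
    then record whether each pixel is brighter than its right-hand neighbour."""
    img = Image.open(path).convert("L").resize((9, 8))
    pixels = list(img.getdata())
    bits = 0
    for row in range(8):
        for col in range(8):
            left = pixels[row * 9 + col]
            right = pixels[row * 9 + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_blocklist(path: str, known_hashes: set[int]) -> bool:
    """Return True if the image's hash lies within the threshold of any hash
    in a (hypothetical) database of known-CSAM image hashes."""
    h = dhash(path)
    return any(hamming(h, k) <= MATCH_THRESHOLD for k in known_hashes)
```

In Apple's proposed design, matches were reportedly further protected by threshold secret sharing, so flagged images could only be reviewed once a device exceeded a set number of matches.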
The move, which drew considerable criticism, was seen as helping law enforcement in criminal investigations but also as potentially opening the door to more legal and government demands for user data.
The reaction persuaded Apple to delay the rollout of its CSAM detection system and child safety features.
Apple removed all references to the plan from its website in December 2021, and publicly abandoned the system a year later.
Operator: Apple
Developer: Apple
Country: USA
Sector: Technology
Purpose: Detect child sexual abuse material (CSAM)
Technology: Perceptual hashing
Issue: Security; Privacy; Surveillance; Accuracy/reliability
Transparency:
System
News, commentary, analysis
https://www.ft.com/content/14440f81-d405-452f-97e2-a81458f5411f
https://www.cnbc.com/2021/08/05/apple-will-report-child-sexual-abuse-images-on-icloud-to-law.html
https://www.theverge.com/2021/8/5/22611721/apple-csam-child-abuse-scanning-hash-system-ncmec
https://news.sky.com/story/apple-to-scan-iphones-for-images-of-child-abuse-12374092
https://9to5mac.com/2021/09/03/apple-delays-csam-detection-feature/
https://www.theregister.com/2021/12/16/apple_deletes_csam_scanning_plan/
https://www.theverge.com/2022/12/7/23498588/apple-csam-icloud-photos-scanning-encryption
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
Page info
Type: Incident
Published: January 2022
Last updated: December 2022