Apple NeuralHash CSAM scanning

August 2021

The Financial Times has revealed that Apple plans to start scanning iPhones and iCloud in the US for child sexual abuse material (CSAM), with matches reported to law enforcement authorities.

The move is seen as helping law enforcement with criminal investigations, but also as potentially opening the door to further legal and government demands for user data.

Apple has since delayed the rollout of its CSAM detection system and child safety features.