Apple has announced plans to scan people’s iPhones for child sexual abuse material (CSAM) using new technology that has raised privacy concerns.
Overnight, Apple confirmed that its “NeuralHash” technology will allow it to detect known CSAM images stored in iCloud Photos. The automated system performs on-device checks of photos before they are uploaded to iCloud.
The system checks for matches against a database of known CSAM compiled by the National Center for Missing and Exploited Children (NCMEC) and alerts human reviewers if illegal content is found. If the image is verified, the reviewer contacts law enforcement.
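NeuralHash itself is proprietary and unpublished, but the general on-device matching flow described above can be sketched roughly. The snippet below is a simplified illustration only: it uses a plain SHA-256 digest as a stand-in for Apple’s perceptual hash, and the database contents and function names are invented for the example.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. NeuralHash is proprietary; SHA-256
    is used here purely to illustrate the matching flow, not the real
    algorithm (a perceptual hash tolerates small image changes)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known images (sample data, not real).
KNOWN_HASHES = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def check_before_upload(image_bytes: bytes) -> bool:
    """On-device check run before upload: True means the photo matched
    the database and would be flagged for human review."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(check_before_upload(b"known-image-1"))   # matches the database
print(check_before_upload(b"holiday-photo"))   # no match, nothing flagged
```

In the real system, matching happens against an encrypted copy of the database shipped with the OS, and a match alone reveals nothing until a threshold of matches is crossed; the sketch above omits those cryptographic safeguards.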
Matthew Green, a researcher at Johns Hopkins University, cautioned that repressive regimes worldwide could use the technology to surveil the public, pointing out that whoever controls the database can search for whatever content they want.
In a tweet he said: “Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone.”
Apple said that the method was “designed with user privacy in mind” and claimed the technology provides “an extremely high level of accuracy”, with less than a one-in-a-trillion chance of an account being incorrectly flagged.
“This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM,” an Apple spokesperson said in a blog post.
The new feature is due to be rolled out to US iPhones later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.