
Apple’s New Plan To Scan iCloud Photos Raises Concerns About Mass Surveillance

Apple’s announcement of a new plan to automatically scan iCloud photos for child sexual abuse material has sparked fears that the system could be used for mass surveillance. The proposed “CSAM” detection system is not all that different from the image recognition systems other major cloud storage providers have used for nearly a decade, but privacy advocates are concerned that it creates an opening for US law enforcement agencies to make demands of the company and could support mass surveillance in other countries.

Device-side scanning of iCloud photos raises privacy concerns

Apple’s new Child Sexual Abuse Material (CSAM) detection system is currently in the testing phase, but the company has committed to rolling it out and expects to include it in an iOS 15 update sometime in 2021. Named “NeuralHash,” the system was first revealed in a series of tweets by Matthew Green, a cryptography professor at Johns Hopkins University. It will roll out in the United States initially, with no target date yet for other countries. Automated child abuse detection systems are banned in some parts of the world, such as the European Union.

The system is not all that dissimilar from those used by other major cloud service providers (such as Google and Dropbox) dating as far back as 2013, but Apple says that NeuralHash differs in some key ways. It will scan images on the device side, though only those shared with iCloud; Apple says the system is actually more private than those commonly in use because it relies on cryptography and compares file hashes rather than running image recognition on the photos themselves, a technique called “private set intersection.” Apple will add a list of hashes of known child abuse images, provided by the National Center for Missing & Exploited Children (NCMEC), to user devices, and will then check iCloud photos against these hashes.
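As a rough illustration of the hash-matching idea only (not Apple’s actual NeuralHash algorithm or its private set intersection protocol), the Swift sketch below compares a digest of a photo queued for upload against an on-device list of known hashes. The hash values and function names are hypothetical, and SHA-256 stands in for the perceptual hash the real system uses.

```swift
import Foundation
import CryptoKit

// Hypothetical on-device list of known-image digests (placeholders standing in
// for the NCMEC-supplied hash database).
let knownImageHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    "9f2fcc7c90de090d6b87cd7e9718c1ea6cb3f9e2b1d1e3570bd9255a5c1f42cb"
]

// Compute a SHA-256 hex digest of the photo data. This is only to keep the
// example self-contained; NeuralHash is a perceptual hash, not SHA-256.
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Check a photo queued for iCloud upload against the on-device hash list.
func matchesKnownImage(_ photoData: Data) -> Bool {
    knownImageHashes.contains(hexDigest(of: photoData))
}
```

The key differences in the real system are that NeuralHash is designed so visually similar images produce matching hashes even after resizing or re-encoding, and that the private set intersection step blinds the comparison so the device never learns which, if any, photos matched.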

Though Apple claims that its new iCloud photos system is more private and secure for the end user than any similar alternative, the announcement nonetheless set off alarms about mass surveillance. The fact that scanning takes place on the device amplified the reaction, and rumors and misinterpretations quickly spread suggesting that Apple intended to passively scan all user files on the device at all times.

Another issue is that the CSAM system allows Apple to remotely decrypt an iCloud account that is flagged. However, the company says that an account will not be flagged unless it reaches a certain (undisclosed) number of file matches among its iCloud photos. When that happens, the user’s iCloud account is decrypted and Apple staff manually review it to determine whether child abuse images are present. If they are, the iCloud account is disabled and Apple says it will notify law enforcement “if necessary.” Apple has also said that an appeals process will be in place in the event of mistakes.
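The threshold mechanic can be sketched in the same illustrative style; the threshold value and type names below are invented, since Apple has not published the real number.

```swift
// Hypothetical illustration of threshold-based flagging: a single match does
// nothing on its own; only once the count of matches crosses the threshold
// would an account be surfaced for manual review.
struct MatchCounter {
    var matchCount: Int
    let reviewThreshold: Int  // undisclosed in the real system

    // Record a new hash match and report whether the account should be flagged.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}

var counter = MatchCounter(matchCount: 0, reviewThreshold: 10)  // placeholder threshold
let flagged = counter.recordMatch()
print("Flag for manual review: \(flagged)")  // false until the threshold is reached
```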

Mass surveillance fears stoked by announcement

While much of the initial fear that spread around the internet stemmed from the misinterpretation that Apple would passively scan all files on all devices at all times for sexual abuse material, privacy advocates have raised some plausible scenarios in which the CSAM system could lead to mass surveillance.

One is that law enforcement could pressure Apple to decrypt iCloud photos with this system in other scenarios, possibly by using an investigation of sexually explicit photos as a pretext. Another is that other types of image hashes might be added to the system; for example, pictures of political protesters. Apple responded to these concerns by asserting that the system can only work with image hashes provided by NCMEC. A spokesperson said that the company will not have the ability to manually add new hashes to the CSAM system, and that outside cryptography experts will be brought in to verify that the system works as advertised.

Apple also posted a notice to its website stating that it would reject any government demands for mass surveillance, that it will not decrypt Messages, and that end-to-end encryption on devices will remain fully functional. It also said that CSAM scanning can be disabled entirely by turning off iCloud Photos, and confirmed that it will not scan local photo collections that are not selected for upload to iCloud. However, the statement did not directly address the possibility of law enforcement requests for access to files via CSAM in individual investigations.

One other concern that has not been addressed involves Apple’s stated policy of obeying national laws. Privacy advocates wonder whether the company would acquiesce if China’s CCP, or another authoritarian government that engages in mass surveillance, demanded that hashes of pictures of political dissidents be added to the CSAM system.

Apple has a strong track record of resisting government requests to break end user encryption or insert backdoors for law enforcement, but it has weakened slightly on its mass surveillance stance as of late. In 2020, Apple scrapped a plan to fully encrypt user phone backups to iCloud after the FBI complained that it would hamper its investigations.