
Proposed EU Law Requiring CSAM Scanning Could Put an End To Encrypted Messages

The European Commission is considering a new child sexual abuse material (CSAM) law that could undermine all encrypted messages if enacted. If it were to become EU law, the proposal would require all messaging services to scan all of their messages for potential CSAM, something that would require either the removal of end-to-end encryption or the insertion of backdoors.

Broad EU law could mandate CSAM detection technologies, compromising all encryption

The European Commission’s proposal is new, but the concept is one that other countries have already grappled with: the idea of ending encrypted messages entirely, at least in a form that the government is not able to access, in the name of ending CSAM.

The proposal would create a new entity, the “EU Centre,” tasked solely with tracking down and removing CSAM from the internet. Assisting the EU Centre with this would be compulsory for businesses, and that assistance would be so invasive that true end-to-end encrypted messages and files would no longer be viable for them to provide to customers. The proposal calls for the EU Centre to be an entirely independent agency, but also says that it should share physical space with the headquarters of Europol.

If it becomes EU law, the proposal would create sweeping new requirements for online businesses to “detect, report, block and remove” CSAM from their platforms. Commissioner for Home Affairs Ylva Johansson says the extreme measure has become necessary because “gatekeeper” platforms such as Instagram and Snapchat have failed to voluntarily remove enough CSAM content.

The “detection” of CSAM is the element that promises to cause the greatest controversy. It would become a requirement under EU law for these businesses to proactively scan all messages and content passing through their servers for potential CSAM, something that is impossible to do with true end-to-end encrypted messages and files. The only options would be for platforms to stop offering encrypted messages entirely, or to offer a weakened and vulnerable form of encryption with a “backdoor” through which to meet their legal obligation to scan.

The EU law would not immediately apply to all relevant businesses; all companies in the messaging and file-handling space would be required to conduct a CSAM risk assessment, and “detection orders” could then be issued to them by a regulator or court. The detection orders would be time-limited and would eventually sunset, but all impacted platforms would need to build this scanning ability into their services in case they are handed such an order. The new EU Centre would settle on exactly what these scanning technologies would look like once it is formed.
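For a sense of what such detection typically involves, the sketch below (in Python, purely illustrative) compares a fingerprint of each file against a hypothetical database of fingerprints of previously identified material. Deployed systems such as Microsoft’s PhotoDNA or the NeuralHash approach Apple proposed use “perceptual” hashes that tolerate resizing and re-encoding rather than the exact match shown here, and that tolerance is also where false positives enter the picture.

```python
# Simplified sketch of hash-based detection against a list of known hashes.
# Real deployments use perceptual hashes that survive minor edits to an image;
# an exact SHA-256 match is used here only to illustrate the general shape.
import hashlib

# Hypothetical database of digests of previously identified illegal images,
# supplied by an authority such as the proposed EU Centre.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_matches_known_material(data: bytes) -> bool:
    """Return True if the file's digest appears in the known-hash database."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES
```

The catch is that a platform can only compute these fingerprints on content it can actually read, which is the crux of the conflict with end-to-end encryption described below.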

As it stands, true end-to-end encrypted messages and files cannot be accessed by anyone but the parties exchanging them, not even the platform handling them. The EU law would not only grant the platform (and, by extension, the government upon request) access to all messages, but could also create vulnerabilities that outside threat actors could exploit to get into private messages.
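To illustrate why, the sketch below uses the open-source PyNaCl library to show the basic shape of an end-to-end encrypted exchange: the platform in the middle relays only ciphertext and never holds a key that can open it, so there is nothing for it to scan.

```python
# Minimal sketch of end-to-end encryption using the PyNaCl library.
# The "platform" in the middle only ever sees ciphertext, which is why
# server-side scanning is impossible without weakening the scheme.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; private keys never leave their devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"A private message")

# The platform relays this opaque blob; it holds no key that can open it.
relayed_by_platform = ciphertext

# Only Bob, holding his private key, can decrypt.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(relayed_by_platform)
assert plaintext == b"A private message"
```

Complying with a detection order would mean breaking this property in one of two ways: scanning content on the user’s device before it is encrypted (client-side scanning), or giving the platform a key it is not supposed to have.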

“Backdoor” not the only issue; AI would attempt to detect intent of conversations

A government-mandated backdoor is a five-alarm privacy fire on its own. Unfortunately, the proposal does not stop there.

The proposed EU law would also require CSAM scans to include the use of an AI algorithm to evaluate the intent of conversations for signs of child grooming. This means not only scanning encrypted messages for the known markers of CSAM files, but also scanning all text and evaluating it for possible criminal intent.

Given this element, privacy advocates are calling the proposed EU law the biggest and most sophisticated surveillance system in the world. It would move beyond simply ending the privacy of encrypted messages to implement AI-driven “pre-crime” elements that bear a worrying resemblance to the film Minority Report. As Matthias Pfau, CEO of Tutanota, observes: “This would be the worst surveillance mechanism ever established outside of China, and all in the pretext of protecting children … Security experts agree that a ‘back door only for the good guys’ is not possible.”

Though the privacy implications are all one really needs to argue against the proposal, there are also a number of other expected consequences that the Commission does not appear to have thoroughly considered as of yet. One is the inevitable massive administrative cost and labor burden placed on tech firms; the likes of Google and Facebook might absorb this, but they might be the only ones left standing. The way the EU law is worded also virtually guarantees that European law enforcement will be awash in CSAM false positives, as it requires platforms to report anything that is not “manifestly” obvious not to be CSAM.

If this proposal feels familiar, it echoes the client-side CSAM scanning that Apple explored last year. While that system was much less invasive than this proposal (and only applied to users of the iCloud service), it nevertheless met with such public backlash that Apple put the project on indefinite hold and deleted all references to it from its website.