EU plan to force messaging apps to scan for CSAM risks millions of false positives, experts warn

The controversial proposal by European Union lawmakers to mandate that messaging platforms scan private communications for child sexual abuse material (CSAM) is facing intense scrutiny from more than 270 security and privacy experts. In an open letter issued Thursday, they warned that enforced scanning could generate millions of false positives every day.

This pushback against the EU’s plan has been ongoing since the Commission initially proposed the CSAM-scanning strategy two years ago. Experts, lawmakers within the European Parliament, and even the bloc’s own Data Protection Supervisor have all voiced alarm over the proposal’s implications.

Under the EU’s proposal, messaging platforms would not only be required to scan for known CSAM but also to employ unspecified detection technologies to identify unknown CSAM and grooming activity in real time. Critics argue that such requirements are technically infeasible and could jeopardize internet security and user privacy, particularly given the reliance on unproven technologies such as client-side scanning.

The letter emphasizes that there is currently no technology capable of meeting the proposed legal demands without causing significant harm. Despite these concerns, the EU is continuing to pursue the implementation of the plan.

Addressing recent amendments to the draft CSAM-scanning regulation proposed by the European Council, the signatories argue that the changes fail to address fundamental flaws within the proposal.

Among the 270 signatories are numerous academics, including renowned security experts like Professor Bruce Schneier of Harvard Kennedy School and Dr. Matthew D. Green of Johns Hopkins University. Additionally, several researchers from tech giants such as IBM, Intel, and Microsoft have lent their support to the letter.

A previous open letter signed by 465 academics last July highlighted the vulnerabilities and flaws of the proposed detection technologies, warning that they could significantly weaken the protections offered by end-to-end encrypted communications.

Little traction for counter-proposals

Last autumn, Members of the European Parliament (MEPs) came together to propose a significantly altered approach to the issue. This revised approach aimed to restrict scanning to individuals and groups suspected of child sexual abuse, limit it to both known and unknown CSAM (removing the grooming detection requirement), and exclude end-to-end encrypted platforms from the scanning mandate. However, the European Council, another key legislative body in EU policymaking, has yet to take a definitive stance on the matter, and its decision will heavily influence the final legislation.

The most recent amendment, presented by the Belgian Council presidency in March, is currently under discussion. However, the experts caution that this proposal still fails to address fundamental flaws inherent in the Commission’s approach. They argue that the revisions would grant unprecedented surveillance and control capabilities over internet users, posing significant threats to digital security and democratic processes.

Although the amended Council proposal includes suggestions for more targeted detection orders and measures to protect cybersecurity and encryption, the experts assert that these adjustments merely scratch the surface of the security and privacy concerns. They warn that, from a technical perspective, the proposal would severely compromise communications and system security. Relying on flawed detection technology to identify cases of interest does not mitigate the risk of the law facilitating widespread surveillance of web users’ messages, according to their analysis.

Additionally, the letter addresses a Council proposal to mitigate the risk of false positives by defining a “person of interest” as a user who has shared CSAM or attempted to groom a child. This determination would likely be made through automated assessment, such as requiring one hit for known CSAM or two for unknown CSAM/grooming before the user is flagged as a suspect and reported to the EU Centre responsible for handling CSAM reports.
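As a rough sketch of the automated assessment the Council proposal describes, the flagging rule might look like the following (the function and constant names are hypothetical and purely illustrative; only the hit counts come from the proposal as summarized in the letter):

```python
# Hypothetical sketch of the Council's proposed "person of interest"
# assessment as described in the open letter. Names and structure are
# illustrative assumptions, not from any actual implementation.

KNOWN_CSAM_HITS_REQUIRED = 1            # one hit on known CSAM flags the user
UNKNOWN_OR_GROOMING_HITS_REQUIRED = 2   # two hits for unknown CSAM or grooming

def is_person_of_interest(known_hits: int, unknown_or_grooming_hits: int) -> bool:
    """Return True if the user would be flagged and reported to the EU Centre."""
    return (known_hits >= KNOWN_CSAM_HITS_REQUIRED
            or unknown_or_grooming_hits >= UNKNOWN_OR_GROOMING_HITS_REQUIRED)
```

Under this rule, a single false match against the known-CSAM database would suffice to report a user, which is why the experts focus on the false-positive rates of the underlying detectors.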

Billions of users, millions of false positives

The experts caution that despite proposed adjustments, this approach is still likely to generate a significant number of false alarms. They explain that the sheer volume of messages exchanged on platforms like Meta-owned WhatsApp, which boasts billions of users, means even a small false positive rate could result in millions of incorrect identifications daily.

For instance, they illustrate that if a detection system for CSAM and grooming had a false positive rate of just 0.1% (one in a thousand messages incorrectly flagged), and only one in a hundred of WhatsApp’s roughly 140 billion daily messages were tested, the result would still be 1.4 million false positives every day. Reducing false positives to the hundreds would require identifying at least five repetitions using different images or detectors; yet once other messaging platforms and email are taken into account, even that reduction becomes statistically improbable.
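The letter’s arithmetic can be reproduced in a few lines of integer math. All figures below are the letter’s illustrative assumptions (message volume, sampling fraction, false-positive rate), not measured values:

```python
# Back-of-the-envelope reproduction of the open letter's false-positive
# estimate. Every figure is an illustrative assumption from the letter.

DAILY_MESSAGES = 140_000_000_000   # approximate WhatsApp daily message volume
SCAN_DIVISOR = 100                 # assume only 1 in 100 messages is tested
FALSE_POSITIVE_DIVISOR = 1_000     # 0.1% -> 1 in 1,000 flagged incorrectly

messages_scanned = DAILY_MESSAGES // SCAN_DIVISOR            # 1.4 billion
false_positives_per_day = messages_scanned // FALSE_POSITIVE_DIVISOR

print(f"{false_positives_per_day:,} false positives per day")
# prints "1,400,000 false positives per day"
```

Even under these conservative assumptions, the daily volume of incorrect flags would dwarf the capacity of any human review process.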

Regarding another proposal to limit detection orders to “high-risk” messaging apps, the signatories argue this revision is ineffective since it would still impact a vast number of users indiscriminately. They note that features essential for CSAM exchange, such as image sharing and text chat, are common across many service providers, implying that a high-risk categorization would affect numerous services.

Moreover, they highlight the increasing adoption of end-to-end encryption (E2EE), suggesting that services implementing E2EE are more likely to be classified as high risk. This risk classification could further escalate with the introduction of message interoperability requirements mandated by the Digital Markets Act (DMA), potentially leading to the classification of nearly all services as high risk due to increased message flow between different platforms.

A backdoor for the backdoor

Regarding encryption protection, the letter reiterates the longstanding message from security and privacy experts: the detection of content in end-to-end encrypted services inherently undermines encryption safeguards. Despite the new proposal’s stated goal of safeguarding cybersecurity and encrypted data while including end-to-end encryption services in detection orders, the experts stress that this objective is contradictory.

They emphasize that the essence of end-to-end encryption lies in ensuring that only the intended recipient can access communication content. Introducing detection capabilities, whether for encrypted or pre-encrypted data, directly contradicts this fundamental principle of confidentiality.

In recent weeks, police chiefs across Europe have issued a joint statement expressing concerns about the expansion of end-to-end encryption and urging platforms to design security systems that still enable the identification of illegal activity and the reporting of message content to law enforcement. While they deny advocating for encryption backdoors, they have yet to specify the technical solutions they seek for enabling lawful access. This ambiguity places the onus back on lawmakers to reconcile these conflicting demands.

The letter’s signatories warn that, unless the Council changes course in response to MEPs’ calls, continuing down the current path will have catastrophic consequences. They caution that such measures would set a precedent for internet filtering, limit individuals’ ability to safeguard their privacy online, and have a chilling effect on digital interactions, especially among teenagers. They further predict that the measures would significantly alter global digital service usage patterns and could detrimentally impact democracies worldwide.

While an EU source close to the Council could not provide details on ongoing discussions among Member States, they confirmed an upcoming working party meeting on May 8 to discuss the proposal for a regulation to combat child sexual abuse.
