It’s the EU’s turn to admit Chat Control won’t work

Last week, the UK government made a statement in the House of Lords acknowledging that portions of the controversial Online Safety Bill might not even be technically enforceable without breaking end-to-end encryption. This rightly received a lot of attention, as it represented a significant shift in the UK government's position.

The law, as drafted, would require tech companies to somehow scan messages for abusive materials. However, as Proton and other privacy advocates have repeatedly explained, there's no way to do this without destroying end-to-end encryption for everyone. While the government's statement fell short of legal changes to the text (something we still believe to be vital), it did represent an important victory in its admission that there is no such thing as tech that can scan everyone's online activity while also providing safety and privacy. It brought the UK in line with long-held expert consensus: any statement to the contrary is a fantasy.

Unfortunately, the European Commission has offered no such public acknowledgement as it continues to push forward its own proposal, commonly referred to as Chat Control. Chat Control is ostensibly a measure to fight child sexual abuse, which we can all agree is abhorrent. But rather than focusing on individuals suspected of these crimes, the text treats everyone on a covered service as a suspect by default, not because they have done anything wrong, but simply because of the service they use.

Fighting crime while protecting privacy

This is a significant departure from judicial measures taken in the offline world. There are many ways to combat crime online without violating the rights of an entire continent. The European Commission’s draft goes even further than the UK’s plans, including provisions that could effectively ban end-to-end encryption for an even wider selection of services, including messengers, email providers, file storage services, and other platforms. 

Like the Online Safety Bill, Chat Control tries to confront the serious problem of illegal content by creating another serious problem: blowing up the right to privacy. 

Lawyers from different European institutions have already said candidly that Chat Control would "lead de facto to a permanent surveillance of all interpersonal communications", which is illegal in the EU. As the Council and the Parliament consider their positions on the European Commission's proposal over the coming weeks, it's vital that lawmakers in Brussels and the European capitals follow these legal recommendations and amend the text accordingly.

What they mean by ‘permanent surveillance’

For years, governments around the world have targeted tech companies in the name of national security, fighting terrorism, or protecting children. Whatever the reason, their proposed solutions too often rely on some form of mass surveillance or backdoor to encryption.

It's the same story with the Online Safety Bill and Chat Control. Each proposal empowers regulators to force companies to break their own encryption by way of client-side scanning, which inspects messages on your device before they are encrypted and sent, or some other hypothetical technology that doesn't exist in reality. The problem is there's no way to implement these methods while preserving privacy.
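To make the privacy problem concrete, here is a minimal, purely illustrative sketch of how hash-based client-side scanning works in principle. All names, the hash database, and the "reported" behavior are invented for this example and don't describe any real proposal or product:

```python
import hashlib

# Hypothetical sketch of client-side scanning (all names and the hash
# database are invented for illustration; no real system is quoted).
# Known-bad content is represented by a set of SHA-256 hashes.
FLAGGED_HASHES = {
    hashlib.sha256(b"example flagged content").hexdigest(),
}

def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the plaintext matches the flagged-content database."""
    return hashlib.sha256(plaintext).hexdigest() in FLAGGED_HASHES

def send_message(plaintext: bytes) -> str:
    # The scan runs on the sender's device *before* encryption, so the
    # plaintext is inspected even on an "end-to-end encrypted" service.
    if client_side_scan(plaintext):
        return "reported"  # flagged content is diverted to the provider
    # ...real end-to-end encryption and delivery would happen here...
    return "sent"
```

The key point the sketch illustrates: the check happens on the plaintext, before encryption ever occurs. Whoever controls the hash database controls what gets flagged, which is why this architecture amounts to inspecting everyone's messages regardless of what the wire encryption promises.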

Breaking end-to-end encryption on a platform for one person breaks it for everyone. Not only does this destroy customers' trust in your service, it invites hackers to find vulnerabilities and steal as much data as they can. There's no such thing as a backdoor that only lets the good guys in.

The irony is that breaking encryption on the most popular platforms won't prevent illegal activities from happening online. Criminals will simply move to other secure, non-cooperative platforms, or run their own encryption software (much of which is open source), and continue their illegal activities out of the public eye.

Toward safety and privacy in the EU

Proton has been very clear: We would take legal action should we receive any request to break our encryption. Leaving aside the fact that these requests would very likely be illegal under European law, giving us grounds for legal action, it would be unacceptable for us to undermine our encryption and the safety of all users, businesses, and organizations that count on us, both in the EU and around the world.

But we're not giving up on the European Parliament and Council doing the right thing. We know from speaking with lawmakers in Brussels that there is growing opposition to the proposals and an understanding of the dangers the draft legislation presents.

However, “understanding” is not enough. The Council and the Parliament are currently working on their respective positions, and are expected to adopt them in the coming weeks. It’s vital that they take into account the current scientific and technological state of play, and amend the text by introducing strong safeguards for encryption, end-to-end encryption, and fundamental rights in general.

Europe has set a global privacy standard thanks to the GDPR, and with NIS2 it also has a leading position in cybersecurity and support for encryption. The EU needs to build on this leadership rather than undermine it. It’s perfectly possible to fight crime while upholding privacy and encryption. We must find a balance between protecting society and protecting civil rights. 
