Why client-side scanning isn’t the answer

Law enforcement agencies generally don’t like end-to-end encryption because it blocks them from accessing the private communications of individual citizens. After decades of trying to make tech companies add a “backdoor” to encryption, they’re now shifting strategies and focusing on other technological solutions that they claim would allow for scanning in end-to-end encrypted environments without breaking encryption.

One of the technologies they refer to is called “client-side scanning”. However, client-side scanning does not preserve privacy in the way its proponents claim. In fact, it could enable new and powerful forms of mass surveillance and censorship. Client-side scanning would put free speech and everyone’s security at great risk without delivering meaningful benefits for law enforcement.

Client-side scanning doesn’t technically break encryption because it doesn’t need to: By the time your data is encrypted, authorities have already seen it. 

Why authorities came up with client-side scanning

To understand client-side scanning, you need to understand end-to-end encryption. 

Privacy-focused tech companies like Proton use end-to-end encryption to make your data inaccessible to anybody but you. Not even Proton can see your information because it’s encrypted before ever leaving your device. Only you or the person you’re talking to can decrypt the data. This prevents Proton’s servers from accessing your data, but it also blocks out hackers and governments. 
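
To make this concrete, below is a minimal sketch of the end-to-end principle using the PyNaCl library. The names, keys, and message are purely illustrative, and real services like Proton Mail use their own protocols and key management; the point is simply that encryption and decryption happen on the users’ devices, so anything in between only ever handles ciphertext.

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Names, keys, and the message are illustrative; real services use their
# own protocols, key management, and authentication.
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts on her device with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"Meet at noon")

# A server relaying the message only ever sees `ciphertext` and cannot
# decrypt it. Only Bob, holding his private key, can recover the plaintext.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"Meet at noon"
```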

End-to-end encryption technology was developed in part by the US government to increase security online. It enables the internet as we know it, used for everything from online banking to protecting dissidents from oppressive regimes. But law enforcement agencies claim that it’s preventing them from doing their jobs. 

The FBI and other agencies have called this “going dark” to suggest that they’re blind to criminal activity because of encryption. In the past, they have advocated requiring tech companies to purposely create a vulnerability in their encryption — a so-called backdoor for law enforcement. But as Proton and other privacy advocates have pointed out, there is no such thing as a backdoor that only lets the good guys in. If there is a key that opens the private communications of millions of people, hackers will steal it.

Policymakers in most countries have generally sided with the privacy advocates, and so far no backdoor mandate has become law.

So lately, authorities that want more access have begun to focus on client-side scanning as a new silver bullet, an alternative they say protects user privacy without undermining end-to-end encryption. But if you know how client-side scanning works, it’s easy to understand why it’s potentially worse than a backdoor.

What is client-side scanning, and how does it work?

The term client-side scanning refers to several technical methods to analyze the contents of a person’s messages on their device. This can include images, videos, and text messages. Typically, the content is checked against a database of prohibited content and flagged for moderation if there’s a match. 

The European Commission published a legislative proposal in May 2022 that would require companies to actively search for child sexual abuse material on their services. While tech companies would be given some latitude to choose how to comply with this requirement, law enforcement officials often present client-side scanning as the best available technology.

The problem with that, as the Electronic Frontier Foundation pointed out, is that the most privacy-preserving methods of client-side scanning are almost impossible to implement. 

So the most likely solution would be local hash matching, in which the digital fingerprints of your messages are compared against the digital fingerprints of prohibited content stored in a database on your device. In theory, the database would contain the hashes for child sexual abuse material or materials related to terrorism, but in practice the database could contain anything. And you would have no way of knowing.
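
As a simplified illustration only, the core of local hash matching might look something like the sketch below. Real deployments would use perceptual hashes (such as PhotoDNA or Apple’s NeuralHash) so that resized or re-encoded images still match, and the database here is hypothetical; plain SHA-256 is used just to keep the example short.

```python
# Simplified sketch of local hash matching on the user's device.
# Real systems use perceptual hashing (e.g. PhotoDNA, NeuralHash) rather
# than a plain cryptographic hash; all names and data here are hypothetical.
import hashlib

# Opaque database of "digital fingerprints" shipped to the device.
# The user has no way of knowing what content these hashes represent.
PROHIBITED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Compute the digital fingerprint (hash) of a message or image."""
    return hashlib.sha256(data).hexdigest()

def scan_before_encryption(content: bytes) -> bool:
    """Runs on the device before the content is end-to-end encrypted.
    Returns True if the content matches the database and should be flagged."""
    return fingerprint(content) in PROHIBITED_HASHES

# Every outgoing picture or message would be checked locally like this,
# and a match would trigger a report -- invisibly to the user.
if scan_before_encryption(b"test"):
    print("Content flagged for review")
```

The key point the sketch makes clear is that the check runs before anything is encrypted, and the hash list itself is a black box to the person whose device is doing the scanning.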

That means whenever you download an app, even if it’s end-to-end encrypted like Signal or WhatsApp, you would be downloading a toolkit designed to inspect all your pictures and text messages and potentially report that data to the developer or the government. 

This is why client-side scanning is even worse than a backdoor. Law enforcement wants to use backdoors in encryption to scan the content shared with others, but client-side scanning would allow them to look at the content you store on your device, whether you share it or not.

Ways client-side scanning can go wrong

Client-side scanning for the purpose of monitoring people’s communications is tailor-made for abuse. Here are just a few of the ways it can go wrong:

Increased attack surface for hackers

Widespread adoption of client-side scanning would open up new opportunities for hackers to monitor people’s communications. And because the attacks would most likely take place on individual users’ devices, app developers would be less able to intervene to protect their users. 

In 2021, a group of well-known computer scientists released a paper describing the technical failings of client-side scanning as a law enforcement solution. One of their key findings is that client-side scanning destroys trust: “The software provider, the infrastructure operator, and the targeting curator must all be trusted. If any of them — or their key employees — misbehave, or are corrupted, hacked or coerced, the security of the system may fail. We can never know when the system is working correctly and when it is not.”

Government censorship and persecution

If regulators begin requiring client-side scanning for the world’s tech companies, nothing from a technical standpoint prevents it from being used to monitor any type of content. Though it may initially be used to look for child abuse or terrorism, a future government could use it to target broader categories of content, raising a greater risk of false positives and attacks on freedom. 

It’s easy to see how some governments could use this technology to target political opponents, journalists, and anyone else they deem objectionable. It would be left to private app developers to either comply with such orders or shut down their service. 

Either way, as soon as client-side scanning is deployed, users who rely on private and secure apps would have no way to trust their communications are protected.

False positives

Even in the best-case scenario, in which responsible governments and well-meaning developers implement a narrow form of client-side scanning, the technology is simply not sophisticated enough to identify only the targeted material. There are already cases of false positives that have ruined people’s lives, such as the father who sent a picture of his son to their doctor and was flagged as a predator and reported to the police.

In a letter opposing the European proposal, over 100 privacy advocacy organizations expressed specific concerns about client-side scanning. Child-abuse survivors describing their trauma to a trusted adult could be flagged for monitoring and reported. Anyone who sends an intimate picture could have that picture mistakenly flagged and examined by a third party. 

Conclusion

Despite the way client-side scanning is optimistically portrayed as a magical solution, it’s not a privacy-preserving alternative to encryption backdoors. And in many ways, it’s worse. Authorities would gain the technical ability to scan everyone’s unencrypted data at all times. Using any chat app or social media platform would mean installing spyware on your device.

Meanwhile, the flood of false positives would likely overwhelm app developers and law enforcement agencies alike. Suddenly tasked with combing through people’s private pictures and messages, they would have to divert resources away from actually preventing criminal activity.

We believe there are many ways to prevent abuse without breaking encryption or using mass surveillance. In fact, mass surveillance is notoriously ineffective at preventing crime. And we have previously written about how we address abuse on our platform.

Law enforcement agencies should be empowered to stop real criminals using proven methods without destroying security and freedom for everyone.
