On October 26, the Online Safety Act became law in the UK. It gives Ofcom, the UK’s communications regulator, broad powers to search for and suppress harmful media and speech by scanning the internet and, despite widespread condemnation from the technology industry, even end-to-end encrypted (E2EE) messages.
Advocates of the law pushed for it as a strong response to some of the worst kinds of online abuse, particularly abuse that targets children. However, the government has since admitted there is currently no technically feasible way to scan E2EE messages or services without breaking their encryption, something it claims it won’t do. The responsibility has therefore shifted to Ofcom to decide how to implement its new powers responsibly, within those technical and ethical limits.
Ofcom has now published a draft consultation on how it intends to enforce the Online Safety Act. There are some positive steps in it: Ofcom has shown that it is not inherently against end-to-end encryption and has exempted email and encrypted messaging, a move that is a clear win for privacy.
However, its proposal to rely on hash matching to fulfill the mission of the Online Safety Act presents significant risks to UK citizens’ privacy. The consultation also implies that file storage and sharing services may be required to implement hash scanning, but we have yet to see how this would work in practice.
Despite Ofcom’s claims that this proposal would preserve privacy, hash matching is not a magical solution. If implemented, it could create a mass surveillance apparatus that would be easy for law enforcement to abuse.
What is hash matching?
Hash matching, or hash scanning, compares pieces of content, such as videos, pictures, or text, against a database of known illegal content. The content is converted into “hashes”, compact digital fingerprints derived from the content itself. The hashes of content stored or shared by a specific user are then compared with the hashes of known illegal content in the database, and the software reports a match if the hashes are identical or similar enough. Some of these databases are developed by private companies, others by NGOs, or even law enforcement agencies.
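To make the mechanics concrete, here is a minimal sketch in Python of how such a matching step might work. Everything in it is an illustrative assumption on our part: the database entries, the threshold, and the helper names are placeholders, and real systems (such as Microsoft’s PhotoDNA) use proprietary perceptual hashing algorithms and curated databases rather than plain cryptographic digests.

```python
import hashlib

# Hypothetical hash database: in real deployments these lists are maintained
# by private companies, NGOs, or law enforcement, typically as perceptual
# hashes rather than SHA-256 digests. The values below are placeholders.
KNOWN_EXACT_HASHES = {
    "0" * 64,  # placeholder digest, not a real entry
}
KNOWN_PERCEPTUAL_HASHES = {0b1010_1100_0011_0101}  # placeholder 16-bit value


def exact_hash(data: bytes) -> str:
    """Cryptographic hash: only matches byte-for-byte identical files."""
    return hashlib.sha256(data).hexdigest()


def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two perceptual hash values."""
    return (a ^ b).bit_count()


def is_flagged(data: bytes, perceptual_hash: int | None = None,
               threshold: int = 2) -> bool:
    # 1) Exact match against the database of known illegal content.
    if exact_hash(data) in KNOWN_EXACT_HASHES:
        return True
    # 2) "Similar enough" match: perceptual hashes within a small Hamming
    #    distance of a known hash are treated as the same image. This
    #    fuzziness is what catches re-encoded copies of an image, and it
    #    is also where false positives come from.
    if perceptual_hash is not None:
        return any(hamming_distance(perceptual_hash, known) <= threshold
                   for known in KNOWN_PERCEPTUAL_HASHES)
    return False


# Example: a user's uploaded file with a (hypothetical) perceptual hash that
# differs from a database entry by a single bit is still reported as a match.
print(is_flagged(b"holiday photo bytes",
                 perceptual_hash=0b1010_1100_0011_0111))  # True
```

The key point the sketch shows is that the decision is made entirely against a list the user cannot see, and that “similar enough” is a tunable threshold, not a precise test.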
Hash matching is a dangerous step toward mass surveillance
Hash matching sounds simple, but several concerning issues become apparent.
- Hash matching is an incredibly difficult technology to implement, and similar systems already in place have returned numerous false positives that can ruin people’s lives. These false positives would put law-abiding users at risk and bog the system down, forcing companies or law enforcement to investigate perfectly innocent media and potentially diverting resources away from real cases of abuse (see the illustrative calculation after this list).
- While Ofcom has exempted encrypted and private messages from this law, every app you download to share files or access social media could contain spyware to examine the media on your device and report it.
- It’s unclear whether encrypted files stored on the cloud would be subject to hash scanning under the Online Safety Act. Now that we live in a cloud-first world, this could give the government the mandate to scan everyone’s files — which often includes people’s most sensitive information.
- Finally, none of the proposals currently discuss how the general public can verify what the database of illegal material will contain. It could very easily become a tool of censorship, similar to how the Chinese government scans for images of the Tiananmen Square protests.
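To see why even a very accurate scanner becomes a problem at national scale, here is a back-of-the-envelope calculation in Python. Every number in it is an assumption chosen purely to illustrate the base-rate effect; none are figures published by Ofcom or any scanning vendor.

```python
# Purely illustrative numbers, not figures from Ofcom or any scanning vendor
users = 50_000_000            # rough order of magnitude for UK internet users
items_per_user_per_day = 20   # photos, videos, and files shared or uploaded daily
false_positive_rate = 1e-6    # one wrong match per million items scanned

items_scanned_per_day = users * items_per_user_per_day           # 1 billion
false_flags_per_day = items_scanned_per_day * false_positive_rate
print(false_flags_per_day)    # 1000.0 innocent files flagged every single day
```

Even with a one-in-a-million error rate, scanning everything everyone shares produces a steady stream of innocent people whose private files get pulled into investigations.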
Ofcom must proceed judiciously
We urge Ofcom to consider very carefully the implications of pushing forward with hash matching. We still object to the Online Safety Act, but we will give credit where it is due. Since the bill became law, Ofcom’s proposal to limit scanning to publicly shared content and the government’s admission that there is currently no feasible way to scan E2EE messages without breaking encryption both show an understanding of the stakes and the technical limitations.
That being said, any proposal to implement hash matching, especially one with so few details, exposes the people of the UK to an even greater violation of their right to privacy.
It’s critical that Ofcom heeds the warnings from the tech community. We will fight to uphold the right to privacy, we will work with regulators to ensure they understand the risks of undermining it, and we will not comply with any demand that does undermine it. We will always protect the rights of the Proton community, wherever its members may live. We will share our thoughts on the proposals in Ofcom’s consultation as soon as we have more details.
While we are still waiting on more details from Ofcom, we hope that lawmakers in Europe take note of the UK’s steps to protect end-to-end encryption and continue the European Parliament’s efforts to improve Chat Control. The European Parliament is expected to publish proposals that are even more privacy-conscious than Ofcom’s. We hope this gives the UK regulator the courage to further improve its approach to the Online Safety Act in the coming months.