The Online Safety Bill is currently working its way through the UK parliament, and it’s expected to be passed into law this autumn. This wide-reaching piece of legislation would force any “user-to-user service” (such as TikTok, Facebook, and Twitter) or search engine that’s available online in the UK to protect all its users from illegal content and children from potentially harmful content.
This is one of the most urgent battles for the future of the internet. Finding ways to quickly remove hateful, harmful, and illegal content makes the internet safer for everyone. We’re happy to see that lawmakers are taking this problem seriously, and it remains a top priority for Proton. However, it’s unclear whether the Online Safety Bill would effectively tackle the problem.
It seems more likely that in its attempt to address all types of harmful content on all online platforms, the bill has resorted to generalities that are open to interpretation and could undermine personal privacy and security.
We believe that if UK lawmakers are serious about wanting to make “the UK the safest place in the world to be online while defending free expression”, they must significantly revise the Online Safety Bill. They must clarify what services and content the bill covers, eliminate the potential for harmful unintended consequences, and take steps to ensure this bill will not compromise end-to-end encryption.
To solve a problem, you must define it
You might be wondering why we’re commenting on a piece of legislation that seems to be targeting social media companies when we offer encrypted email, calendar, cloud storage, and VPN services. We feel we must speak up because of the influence this bill will have, not only in the UK but on the internet across the globe. While email services are excluded from the current text, online cloud storage services, like Proton Drive, are not. Furthermore, the Online Safety Bill could pave the way for future legislation that would target more services and push for even more far-reaching measures.
At this stage, the bill is so broad that it’s not entirely clear who would be subject to it. While primarily targeting social media companies, the bill defines “content” as anything that is “communicated publicly or privately”. In practice, as tech companies (like Proton) often offer single accounts encompassing a number of different services, it’s likely that services that are not meant to be subject to the law (like email) will inadvertently become subject to it by extension.
That essentially means that almost any online service that has users in the UK could be affected. It also means that messages you send your mom could be treated the same as something you post on social media for everyone to see, which comes dangerously close to violating UK citizens’ explicit right to a private life.
Another key area of confusion is that the bill would require online services, like Facebook, to enforce their terms of service or face government sanctions, including criminal liability and jail time for executives (Clause 65). Essentially, the UK government is outsourcing its duty to define harmful content to private companies, encouraging self-censorship. At the same time, the government retains the right to decide whether a company is enforcing its terms to the government’s satisfaction and can impose severe punishments if it judges that it isn’t.
This almost guarantees that companies will overcorrect and remove perfectly legal and otherwise protected speech from their platforms rather than risk being liable.
This is exactly what happened when the US enacted the FOSTA-SESTA bills, ostensibly about preventing sex trafficking. However, the law’s unclear mandate caused broad-based censorship around anything that could possibly be used to “promote or facilitate prostitution”. For example, Craigslist shut down its personals section, Reddit closed numerous subreddits, and smaller websites simply shut down. As it’s currently written, the Online Safety Bill would have a similarly chilling effect on free speech.
A ban on end-to-end encryption in all but name
Clause 110 of the Online Safety Bill would allow the UK government to require any “user-to-user service” to use “accredited technology” to identify and remove child sexual abuse material (CSAM) or terrorist content “whether communicated publicly or privately by means of the service”.
In plain English, clause 110 would give the UK government broad powers that would allow it to require any online service available in the UK to monitor all user-generated content on its platform, including its users’ private messages.
This is a problem for end-to-end encrypted services, which can’t access their users’ content.
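To see why, consider a toy sketch of end-to-end encryption (a one-time-pad XOR stands in for a real cipher such as AES-GCM, and all names here are illustrative, not any service’s actual code). Because keys live only on users’ devices, the service in the middle stores and relays ciphertext it cannot read:

```python
# Toy illustration (not real production cryptography): with end-to-end
# encryption, ciphertext is all the service ever sees, so it has nothing
# it could scan for prohibited content.
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # One-time-pad XOR as a stand-in for a real cipher. XOR is its own
    # inverse, so the same function encrypts and decrypts.
    return bytes(k ^ b for k, b in zip(key, data))

message = b"private note to my mom"
# The key is generated on, and never leaves, the users' devices.
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(key, message)   # this is what the server stores/relays
assert ciphertext != message            # the server sees only random-looking bytes

decrypted = xor_cipher(key, ciphertext) # only key holders can do this step
assert decrypted == message
```

The point of the sketch is structural: any scanning obligation must therefore be satisfied either before encryption (on the user’s device) or by removing the encryption itself.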
While UK lawmakers have stated they don’t want to ban end-to-end encryption, the only ways an end-to-end encrypted service could comply with the bill are:
- Remove its end-to-end encryption
- Weaken its end-to-end encryption
- Install client-side scanning
- Cease providing service in the UK
This would be an overt re-creation of the mass surveillance systems that Edward Snowden exposed back in 2013. Not only would this violate UK citizens’ right to privacy, but there’s little to no evidence that mass surveillance is effective at reducing crime or terrorism. Client-side scanning is already in place on some platforms, such as Google’s, and its false positives have had disastrous real-life consequences. This surveillance would also have a chilling effect on free speech and make it harder for journalists and whistleblowers to expose wrongdoing.
Weakening end-to-end encryption would reduce everyone’s safety online, including the children this bill is trying to protect. Without strong encryption, the sensitive data of millions of people would be at risk.
The Online Safety Bill must be revised to tackle illegal content
We share the goal of making sure people are safe online. We’re doing our utmost to ensure that illegal content has no place on our services, and we’ve participated in Tech Against Terrorism events over the years to share best practices. Our Anti-abuse team makes up roughly 10% of our staff, and they’re constantly investigating new and innovative ways of fighting abusive content while protecting our users’ privacy.
But this law leaves far too much up to interpretation. The advocates for the Online Safety Bill have noble intentions — we all want to stop the spread of CSAM and terrorist content — and we’re happy to see policymakers finally enter the debate on how to improve online safety. But their approach is misguided and dangerous.
The bill puts our right to privacy, our right to free speech, and the economic functioning of the internet at risk while doing little to protect people online. In fact, weakening encryption would put people at greater risk.
We call on the UK government to revise the Online Safety Bill to better protect the right to privacy, the right to free speech, and the encryption the internet relies upon to function. We would also point out that any program combatting CSAM should include greater funding for child protection services, counseling, and law enforcement.
If the law is passed, we will do everything within our power to comply while protecting our users. We founded Proton so that everyone can exercise their fundamental human right to protect their privacy online. As long as we can ensure the privacy of the Proton community in the UK, we will continue to operate there.
UPDATE March 6, 2023: Removed a reference to the Online Safety Bill applying to “large” companies. In fact, it would apply to companies of all sizes, placing a large technical burden on many medium-sized and small businesses.