The UK's Online Safety Act Just Made E2E Encryption Illegal
The United Kingdom's Online Safety Act passed in October 2023 with cross-party support, a long list of civil society objections, and a provision called Section 121. Section 121 authorizes Ofcom, the UK's communications regulator, to require "regulated user-to-user services" to use "accredited technology" to identify and remove illegal content on their platforms. The text of the provision is vague about what this technology looks like. The technical reality is not vague at all.
There is no accredited technology that can scan end-to-end encrypted content without either decrypting the content or inserting scanning at the client endpoint, which is technically equivalent to breaking the encryption model. The industry has been saying this clearly for three years. The government has been ignoring it.
This spring, Ofcom issued its first Section 121 notices to major messaging platforms. The text of the notices is not public, but the existence of the enforcement action has been confirmed by multiple affected companies through legal filings and regulatory disclosures. The theoretical question of whether the Online Safety Act is compatible with end-to-end encryption is no longer theoretical. It is being adjudicated in real time, and the platforms have a limited set of responses.
None of the responses preserve the thing that users signed up for.
What Actually Changes
Platforms served with Section 121 notices have three practical options. Each is bad in a different way.
The first option is to comply by implementing client-side scanning. In this approach, the user's device scans messages before they are encrypted and flags content matching specified databases of known illegal material. Compliance advocates describe this as "preserving encryption" because the scanning happens pre-encryption, and the encrypted channel itself is not compromised. This framing is misleading. Client-side scanning means the platform has code running on the user's device that reports on the content of their communications to a third party. The encryption no longer protects the user from the platform. It only protects the user from everyone else, which is a fundamentally different privacy model than what end-to-end encryption was supposed to provide.
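The mechanism is easy to make concrete. The sketch below is an illustration, not any platform's actual implementation: real deployments use perceptual hashes (PhotoDNA-style) so that near-duplicates also match, and the hash database, the reporting hook, and the encryption stub here are all hypothetical stand-ins.

```python
import hashlib

# Hypothetical database of hashes of known prohibited material,
# pushed to every client by the platform. SHA-256 is used only to
# keep the sketch self-contained; real systems use perceptual hashes.
BLOCKED_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def client_side_scan(plaintext: bytes) -> bool:
    """Runs on the user's device *before* encryption."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKED_HASHES

def encrypt_and_send(plaintext: bytes) -> bytes:
    # Stand-in for the normal E2E encryption path, which is untouched.
    return plaintext

def report_to_platform(plaintext: bytes) -> None:
    # Hypothetical reporting hook. This call is the change in the
    # privacy model: the encrypted channel still excludes everyone
    # else, but the device now reports on its owner's content.
    pass

def send_message(plaintext: bytes) -> bytes:
    if client_side_scan(plaintext):
        report_to_platform(plaintext)
    return encrypt_and_send(plaintext)
```

The sketch makes the framing dispute precise: nothing in the encrypted channel changes, which is why compliance advocates can say encryption is "preserved." What changes is where the trust boundary sits, and it now runs through the user's own device.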
The second option is to withdraw from the UK market. Signal and several smaller secure messengers have publicly stated they will leave the UK rather than implement scanning. WhatsApp has signaled the same intention in ambiguous terms, though Meta has not committed to withdrawal. iMessage's situation is complex because Apple's hardware position in the UK makes withdrawal expensive and politically fraught. The platforms that leave protect their users by leaving them without service. The platforms that stay protect their service by reducing the privacy guarantees.
The third option is to challenge the enforcement notices in court. This is slower, more expensive, and does not resolve the underlying legal conflict. Even a successful challenge to a specific notice does not prevent Ofcom from issuing future notices under slightly different factual circumstances. The structural tension between the law and the technology remains regardless of how any individual enforcement action is resolved.
Why the Quiet Timeline
The Online Safety Act passed in 2023, but the substantive enforcement has been delayed by multiple rounds of consultation, accredited technology certification processes, and political reshuffling. The government spent 2024 and 2025 hoping that platforms would voluntarily implement the expected scanning infrastructure. The platforms did not. The consultation processes produced largely negative technical assessments. The "accredited technology" has not materialized because the underlying engineering problem is not solvable in any way that preserves the properties the law describes itself as preserving.
By spring 2026, the political patience has run out. The enforcement notices are arriving. The platforms are being forced into the decision they have been trying to postpone.
This is not accidental timing. The government has waited long enough that the law's opponents have largely moved on, the news cycle has shifted, and a new political administration has inherited the enforcement decision without the burden of the original debate. The same pattern that produced the original law is producing its execution: the surface-level rationale is child safety, the operational effect is mass communication surveillance infrastructure, and the debate about whether these two things are compatible has been deliberately structured to make opposition appear to oppose child safety.
This is not a coincidence of drafting. It is the technique.
What Signal Leaving Actually Costs
Signal's threat to leave the UK is often framed as a drastic protest gesture. It is not a protest gesture. It is a technical necessity. Signal is a non-profit organization that publishes open-source client software and runs its own servers. It does not have the engineering bandwidth or institutional flexibility to implement client-side scanning while maintaining its existing architecture. More importantly, the moment Signal ships a client-side scanning component, every other jurisdiction watching the UK will request the same capability, and the entire privacy proposition of the product collapses globally.
For Signal, there is no regional compliance model that does not compromise the global product. Leaving the UK is the only option that preserves the rest of the user base.
What UK users lose is significant. Signal is the messaging platform that journalists, political organizers, lawyers communicating with clients, victims of domestic abuse, and dissidents abroad rely on for genuinely private communication. Other platforms that remain in the UK with client-side scanning installed will provide a version of privacy that is adequate for most users and inadequate for the specific users who most need the stronger version.
The irony is precise: the law designed to protect vulnerable users will, in its primary effect, strip protection from the users who are most vulnerable.
Why This Matters Beyond Messaging
The Online Safety Act is not unique. The European Union's Chat Control proposal has been cycling through the Council for several years. Australia passed an analogous framework in 2023. Canada is considering similar legislation. The United States has multiple bills, notably the EARN IT Act, attempting to achieve comparable outcomes through different mechanisms. The international regulatory trend is toward a model in which end-to-end encryption is permitted in theory but made operationally impossible by scanning requirements layered on top.
The UK is the first Western democracy to actually execute this model at scale on major platforms. How the platforms respond, how the courts rule, and how the political system reacts to any public backlash will establish precedents for every analogous framework being considered elsewhere. If the UK successfully forces client-side scanning on major messengers without significant political cost, the EU framework follows within 18 months. If the platforms execute their withdrawal threats and the political cost becomes visible, the EU framework gets stalled or rewritten.
This is the moment when the question gets answered. The answer is being written this spring in the form of enforcement notices that most people do not know exist.
The Bitcoin and Nostr Context
The same dynamic that produces messaging surveillance produces payment surveillance. The operational template is identical: specify a concern, mandate scanning, define "accredited technology", enforce against platforms. The EU's Transfer of Funds Regulation already mandates identification of non-custodial wallet counterparties for transactions above certain thresholds. The FATF Travel Rule creates analogous requirements at the international level. The infrastructure being built for message content scanning is the same infrastructure that can be repurposed for payment content scanning.
This is why Bitcoin's self-custody properties and Nostr's relay-based architecture matter structurally rather than only ideologically. They are designed in ways that make the equivalent of Section 121 enforcement technically incoherent. You cannot issue a scanning notice to a Bitcoin node operator in the same way you can issue one to WhatsApp. The nodes are not structured as regulated user-to-user services in the legal sense the Online Safety Act requires.
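The structural point can be shown with a toy model. This is not real Nostr code (real relays speak a WebSocket protocol and events are cryptographically signed, per NIP-01); the relay names and classes below are invented purely to illustrate why there is no single regulated service to serve a notice on: the client fans out to interchangeable relays it chooses.

```python
# Toy model of a relay-based architecture. Each relay is an
# independent, interchangeable message store; the client, not any
# relay, decides where an event goes.

class Relay:
    def __init__(self, name: str):
        self.name = name
        self.events: list[dict] = []

    def accept(self, event: dict) -> None:
        self.events.append(event)

def publish(event: dict, relays: list[Relay]) -> int:
    """Broadcast one event to every relay; return the delivery count."""
    for relay in relays:
        relay.accept(event)
    return len(relays)

relays = [Relay("relay-a"), Relay("relay-b"), Relay("relay-c")]
publish({"content": "hello"}, relays)

# A notice served on relay-a changes nothing for the client: drop it
# from the list and the next event still reaches the others.
publish({"content": "hello again"}, relays[1:])
```

In this model, enforcing against any one relay removes one interchangeable node, not the service; that is the asymmetry the paragraph above describes.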
That technical incoherence is not a bug. It is the point. The rails that survive the next decade of communication and payment regulation will be the ones that are technically incompatible with the enforcement model being deployed against conventional platforms.
The UK just demonstrated why. Quietly. Through a notice most people never heard about.
This article represents the personal opinion of the author and is for informational purposes only. It does not constitute financial, investment, or legal advice. Always do your own research.