Britain's vague internet law pushes services to block UK users rather than comply
Regulatory uncertainty forces legitimate platforms to choose between expensive lawyers, pre-emptive blocking, and closure
A Swedish non-profit that helps open source developers coordinate software projects recently hired lawyers to answer a peculiar question: does British law apply to them? Libera.Chat operates from Sweden, banks in Sweden, and serves a global community of programmers who use Internet Relay Chat to build free software. Yet Britain's Online Safety Act forced them to seek legal advice about whether they're subject to UK regulation.
The answer, after consultation: probably not, but nobody can say for certain.
This uncertainty isn't a bug in the legislation. It's how the law works. The Act requires services with "links to the UK" to implement safety measures, but never defines what constitutes a "significant number" of UK users—the main test for whether those links exist. Ofcom, the regulator, says platforms should simply "explain their judgment" about whether they're in scope. For organisations trying to follow the law, this amounts to: figure it out yourself, hire lawyers if you can afford them, and hope Ofcom agrees with your reasoning.
Libera.Chat chose to share its legal advice publicly, reasoning that other online communities face identical uncertainty. That transparency reveals something striking: when even a Sweden-based non-profit serving software developers needs lawyers to determine if UK law governs them, regulatory vagueness has become the regulation itself.
A test with no threshold
The Online Safety Act sweeps astonishingly wide. Any service letting users share content or interact falls within scope—social media, messaging apps, forums, gaming servers, IRC networks. Ofcom estimates over 100,000 services worldwide must comply.
Yet the law provides no numerical threshold for "significant" UK users. Is it 1,000 users? 10,000? One per cent of your user base? Ten per cent of Britain's population? The legislation doesn't say. Ofcom won't clarify.
Libera.Chat's lawyers suggested that "significant" should be measured against Britain's total population, not against a service's user base. Under this reading, a platform with 10,000 users worldwide, 3,000 of them British, wouldn't have a "significant" number of UK users: 3,000 people amount to roughly 0.004 per cent of 68 million Britons, even though British users make up 30 per cent of the platform's entire audience. This interpretation would confine the Act to only the largest platforms.
Yet Ofcom estimates 100,000 services are in scope. Either the population-relative interpretation is wrong, or services must assume they're subject to UK law unless they can prove otherwise—a reversal of normal regulatory principles. Instead of regulators demonstrating jurisdiction, platforms must demonstrate its absence.
For organisations without legal departments, this creates impossible choices. Pay lawyers for advice that might not provide certainty. Implement compliance measures that may be unnecessary but are certainly expensive. Or eliminate the question entirely by blocking British users.
When exclusion becomes compliance
Four file-sharing services faced Ofcom investigations over child safety concerns. Their response: geo-block UK internet addresses. Ofcom closed the investigations, satisfied that blocking had "significantly reduced the likelihood that people in the UK will be exposed to any illegal or harmful content."
Consider what this means. The regulator treats denying access to everyone as equivalent to protecting users through safety measures. One approach blocks legitimate users alongside harmful content. The other implements targeted protections whilst maintaining access. By accepting geo-blocking as satisfactory, Ofcom signals that territorial exclusion achieves the law's goals as effectively as actual safety improvements.
The pattern repeats. Gab blocked UK users. Civit.ai blocked UK users. A suicide prevention forum began geo-blocking when enforcement proceedings started. Ofcom expressed satisfaction that the forum is no longer accessible from Britain—apparently unconcerned that people seeking suicide prevention resources now cannot reach it.
This reveals the law's practical operation. Platforms escape British regulation not by making their services safer, but by making themselves unavailable to British people. The legislation creates a perverse incentive: blocking UK users is easier and cheaper than compliance, particularly when the regulator explicitly accepts blocking as a solution.
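It is worth spelling out just how little engineering this exit requires. The sketch below is a hypothetical Go middleware, assuming a service behind a CDN such as Cloudflare that sets the CF-IPCountry header from a GeoIP lookup; a self-hosted service would consult a GeoIP database against the client address instead. A dozen lines, returning the HTTP 451 status code ("Unavailable For Legal Reasons"), is roughly the whole cost of "compliance" by exclusion.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// blockUK refuses requests that the upstream CDN has tagged as coming
// from the UK. Assumes a proxy like Cloudflare, which sets CF-IPCountry
// from a GeoIP lookup on the connecting address.
func blockUK(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Header.Get("CF-IPCountry") == "GB" {
			// 451 Unavailable For Legal Reasons: the status code
			// created for exactly this situation.
			http.Error(w, "This service is not available in the United Kingdom.",
				http.StatusUnavailableForLegalReasons)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "Welcome.")
	})
	log.Fatal(http.ListenAndServe(":8080", blockUK(mux)))
}
```

Set against legal advice, age verification contracts, and personal criminal liability for senior managers, the comparison is not close.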
British users increasingly face barriers to legitimate online communities, especially smaller platforms and specialist services. When Ofcom's own enforcement actions push services toward geo-blocking, the law achieves something close to the opposite of making Britain "the safest place to be online." It makes Britain a place many online services simply refuse to serve.
The death of small communities
London Fixed Gear and Single Speed wasn't a business. It was a community hub where bicycle enthusiasts met partners, supported each other through illness, reunited owners with stolen bikes, and provided mental health support. Members credited it with preventing suicide. Yet the forum shut down rather than face the Online Safety Act's requirements.
The arithmetic was brutal. Legal compliance would cost tens of thousands of pounds; the forum received a few hundred pounds monthly in donations. Age verification alone would cost £2,400 annually, just for the external service, before legal advice, before staff time, before liability insurance: a single line item on the order of the forum's entire yearly donation income. The same regulatory requirements that Meta absorbs without noticing destroyed a volunteer-run community space.
Microcosm, hosting nearly 300 community forums for 275,000 users, announced similar closures. The operator couldn't accept personal criminal liability for failing to comply with Ofcom enforcement notices—a risk the Act imposes on senior managers. Thousands of people lost community spaces serving as support networks, hobby groups, and mutual aid organisations.
The Act includes proportionality provisions. Smaller services supposedly face lighter obligations than large platforms. But "lighter" doesn't mean "affordable." Even reduced compliance requirements impose costs beyond volunteer communities' means. A forum getting hundreds in donations monthly cannot afford thousands in annual compliance costs, no matter how proportional those costs seem relative to Meta's burden.
The resulting consolidation seems almost by design. The problems motivating the Online Safety Act, harmful content, inadequate moderation, user exploitation, stem primarily from large commercial platforms, not bicycle forums or open source IRC networks. Yet uniform regulatory requirements hit small communities hardest whilst large platforms continue operating with compliance costs that barely register against revenues of billions.
Claiming the world
Britain's approach asserts authority over any service accessible to British users. This creates friction. Libera.Chat—Swedish organisation, Swedish legal status, Swedish banking—must consider whether British regulations apply because British developers use IRC to coordinate software projects.
The extra-territorial reach affects service design. Platforms might avoid features interpreted as targeting UK users. They might implement systems to track and limit British audiences. Some conclude any UK users create unacceptable regulatory risk. A British law discouraging services from serving British people.
Jersey, a Crown dependency, declined to enforce the Act in its territory, citing "inadequacies" in the legislation. Even authorities within Britain's own constitutional orbit question the law's workability. Wikipedia challenged the Act in court, arguing compliance would compromise its open editing model and invite state censorship. The Wikimedia Foundation lost, but the challenge itself signals how far the Act's reach extends.
Individual countries can regulate global platforms through extra-territorial legislation, but early evidence suggests this approach creates fragmentation rather than safety. Services respond by geo-blocking entire countries. The internet balkanises, access determined by physical location rather than technical capability.
Unintended casualties
Open source software development relies on IRC and similar platforms for project coordination. When these services face British regulatory uncertainty or choose to block UK users, British developers need workarounds. Libera.Chat noted this explicitly: UK individuals shouldn't require censorship-defeating proxies just to engage with free software communities.
VPN downloads surged 1,800 per cent when age verification requirements took effect. Users, including minors the law aims to protect, learned to circumvent geo-restrictions. Many free VPN services harvest user data or provide false anonymity. By pushing people toward these tools, the legislation exposes users to different harms.
The pattern shows regulatory vagueness producing worse outcomes than targeted regulation would. Services face impossible choices. Legitimate platforms block UK users. Community spaces close. Developers need workarounds for basic project infrastructure. None of these outcomes advances the goal of protecting users from harm.
Libera.Chat's transparency—publishing legal advice about regulatory uncertainty—provides a model. But it also highlights the problem: even professional legal counsel cannot provide definitive answers about whether British law applies. This uncertainty itself functions as regulation, pushing services toward the safest legal option: unavailability to British people.
The law's architects presumably didn't intend to incentivise geo-blocking or force forum closures. Yet these consequences flow directly from the legislation's structure. Vague definitions, undefined thresholds, and regulatory acceptance of territorial exclusion combine to make rational business decisions produce perverse outcomes. The safest legal path increasingly means denying access to British users rather than attempting compliance with rules nobody can definitively explain.
The Online Safety Act aimed to make Britain the safest place online. Instead, it's making Britain the place online services increasingly choose to avoid.