Wikipedia’s Nonprofit Host Brings Legal Challenge to New Online Safety Act (OSA) Regulations
Written by Phil Bradley-Schmieg, Lead Counsel, Wikimedia Foundation
On 8 May 2025, the Wikimedia Foundation, the nonprofit that hosts Wikipedia, announced that it is challenging the lawfulness of the UK Online Safety Act (OSA)’s Categorisation Regulations. We are arguing that these regulations place Wikipedia and its users at unacceptable risk of being subjected to the OSA’s toughest “Category 1” duties, which were originally designed to target some of the UK’s riskiest websites.
The Wikimedia Foundation shares the UK government’s commitment to promoting online environments where everyone can safely participate. However, if enforced on Wikipedia, Category 1 duties would undermine the privacy and safety of Wikipedia volunteer users, expose the encyclopedia to manipulation and vandalism, and divert essential resources from protecting and improving Wikipedia and the other Wikimedia projects.
After years of dialogue with UK regulators and policymakers, these issues remain unaddressed. The categorization rules are now in force, and the first categorization decisions from the UK’s online safety regulator, Ofcom, are expected this summer. The solutions we proposed, including clarifying the new rules, were not accepted. As ministers have put it:
We felt we needed to get on with it and put these measures into place […] None of these issues are off the table, but we just wanted to get the Act rolled out in as quick and as current a form as we could. […] I recognise that the Act is imperfect.
With time running short, we have initiated a legal challenge to the Categorisation Regulations. We are taking action now to protect Wikipedia’s volunteer users, as well as the global accessibility and integrity of free knowledge.
The Wikimedia Foundation is not bringing a general challenge to the OSA as a whole, nor to the existence of the Category 1 duties themselves. Our legal challenge focuses solely on the broad new categorization rules that risk imposing Category 1 duties on Wikipedia, either this year or in 2026.
The threat to Wikipedia
There are many OSA Category 1 duties, and each could impact Wikipedia in different ways, ranging from extraordinary operational burdens to serious human rights risks. Wikipedia has thousands of volunteer users based in the UK alone, and hosts content from cultural institutions such as the British Library and the Wellcome Collection. But the law’s impact would extend far beyond the UK.
The Category 1 “user verification and filtering” duties are a good example. Wikipedia’s content is continually reviewed and curated by volunteers to ensure it is neutral, fact-based, and well sourced. Sophisticated volunteer communities, working in over 300 languages, collectively govern almost every aspect of day-to-day life on Wikipedia. Their ability to set and enforce policies, and to review, improve or remove what other volunteers post, is central to Wikipedia’s success, notably in resisting vandalism, abuse, and misinformation, including during fast-moving events when misinformation and race-baiting surge.
However, if Wikipedia is designated as Category 1, the Wikimedia Foundation would need to offer users a way to verify their identity. That rule does not itself force every user to undergo verification, but under a linked rule, the Foundation would also need to allow other (potentially malicious) users to block all unverified users from fixing or removing any content they post. This could mean significant amounts of vandalism, disinformation or abuse going unchecked on Wikipedia, unless volunteers of all ages, all over the world, undergo identity verification.
Although the UK government felt this Category 1 duty (which is just one of many) would usefully protect people on social media, Wikipedia is not like social media. Wikipedia relies on empowered volunteer users working together to decide what appears on the website. This new duty would be exceptionally burdensome (especially for users with no easy access to digital ID). Worse still, it could expose users to data breaches, stalking, vexatious lawsuits or even imprisonment by authoritarian regimes. Privacy is central to how we keep users safe and empowered. This duty, designed with social media in mind, is just one of several Category 1 duties that could seriously harm Wikipedia.
Who falls under Category 1?
Parliament originally designed the Category 1 duties for services presenting the greatest risks to society. It then asked the Secretary of State for Science, Innovation and Technology to precisely define which services would be caught. In February 2025, the government issued the Categorisation Regulations, closely tracking Ofcom’s earlier advice.
To avoid any risk of loopholes, and due to limited research, the Category 1 threshold conditions were left especially broad and vague. They have no real connection to actual safety concerns. They were designed around three flawed concepts:
- Definition of content recommender systems: Having any “algorithm” on the site that “affects” what content someone might “encounter” is seemingly enough to qualify popular websites for Category 1. As written, this could even cover tools that are used to combat harmful content. We, and many other stakeholders, have failed to convince UK rulemakers to clarify that features that help keep services free of bad content, like the review tools used by Wikipedia article reviewers, should not trigger Category 1 status. Other rarely used features of Wikipedia are also at risk.
- Content forwarding or sharing functionality: If a popular app or website also has content “forwarding or sharing” features, its chances of ending up in Category 1 are dramatically increased. The Regulations fail to define what they mean by “forwarding or sharing functionality”: features on Wikipedia (like the one allowing users to choose Wikipedia’s daily “Featured Picture”) could be caught.
- Platform popularity: Finally, in assessing popularity, the Regulations seemingly do not differentiate between users who visit a site just once a month, however briefly (for example, just to look up a date of birth on Wikipedia), and those who spend hours each day “doomscrolling” potentially harmful content on social media. All that matters is whether a website or app has several million UK visitors a month in total. Research shows enormous differences in how educational services, like Wikipedia, are actually used in practice.
As a result, there is now a significant risk that Wikipedia will be included in Category 1, either this year or from 2026 onwards. The Regulations do not just risk overregulating low-risk “outlier” services, like Wikipedia and navigation/mapping apps. As designed, they will also fail to catch many of the services UK society is actually concerned about, like misogynistic hate websites. This mismatch is well documented, and has prompted an outcry from a wide range of stakeholders.
Litigation was avoidable—and nothing prevents its quick resolution
Fixing the scope of the OSA itself, or at least the scope of Category 1, would avoid human rights risks, minimize red tape, and keep regulatory enforcement focused squarely on where it is most needed. Yet despite warnings from academics and civil society, and years of engagement with rulemakers, other interests are still being prioritized. Eighteen months after the OSA’s entry into force, these problems remain unaddressed.
We do not dispute the need for sensible online regulation. Wikimedia’s free knowledge projects are safe and important resources through which people across the UK—and the wider world—learn, share knowledge, collaborate, and gain media literacy, in over 300 languages.
But for services like Wikipedia to thrive, it is essential that new laws do not endanger charities and public interest projects.
Thankfully, many new online safety laws are written flexibly. Others, for example in Australia, France, and Germany, can be more specific, but then avoid collateral damage by exempting nonprofit organizations or educational projects.
That was also the intention for the OSA. Its turbulent development, however, led to what has been described as a “Frankenstein” law. It is around 300 pages long, and imposes more duties than any other online safety law of which we are aware. It has already accumulated over 1,500 pages of supplemental Ofcom guidance and Codes of Practice, with more yet to come.
To avoid the extra Category 1 duties being applied too broadly, Parliament specifically gave ministers the ability to tailor the scope of Category 1 according to a range of factors (which, we argue, should at least have included charitable status), and also gave government ministers and Ofcom other important burden-reduction powers. This gives the government and Ofcom all they need if they want to improve how the law treats low-risk, socially beneficial services, and their users.
What happens next
The Wikimedia Foundation must act now to ensure Wikipedia is protected for the future, and we are asking for expedited hearings. Time is short: under UK rules for judicial review, the Categorisation Regulations must be challenged rapidly. Ofcom is already demanding that we provide the information it needs to make a preliminary Category 1 assessment for Wikipedia. It is in the interest of UK society for laws that threaten human rights to be challenged as early as possible.
Meanwhile, Ofcom is expected to make its first decisions about Category 1 status very soon. Given how broadly worded the Categorisation Regulations are, and with the Department for Science, Innovation and Technology and Ofcom steadfastly refusing to offer additional guidance, we cannot predict what Ofcom will do, now or in future years. Category 1 will remain a risk for Wikipedia for as long as it can include popular websites simply because they implement content “forwarding or sharing” features, and/or deploy “algorithms” that somehow “affect” what content users might encounter.
We regret that circumstances have forced us to seek judicial review of the OSA’s Categorisation Regulations. Given that the OSA is intended to make the UK a safer place to be online, it is particularly unfortunate that we must now defend the privacy and safety of Wikipedia’s volunteer editors from flawed legislation.
Stay informed on internet policy and Wikipedia’s future. 📩