What the Online Safety Act Tries to Do
The UK’s Online Safety Act is one of the broadest attempts yet to regulate the internet. It’s designed to crack down on illegal material as well as content that is legal but deemed harmful to children, the surviving core of what lawmakers once called “legal but harmful.” In practice, that means platforms now have to verify user ages, moderate conversations, and prove they’re keeping unsafe material out of reach.
The rules are strict. Game developers must add systems to check ages before unlocking features like chat, voice, or user-generated content, and platforms like Steam, Xbox, and PlayStation are already changing their services to comply. The penalties for noncompliance are severe: fines can reach £18 million or 10% of global annual revenue, whichever is greater. That threat alone has pushed developers to cut features, delay launches, or even block UK users entirely.
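To make that burden concrete, here is a minimal sketch of the kind of feature gate the law effectively demands. Everything in it is an assumption for illustration: the age_verified flag stands in for whatever third-party age-assurance service a studio actually integrates, and the feature names are made up.

```python
# Hypothetical sketch of an age gate for social features.
# Player, age_verified, and the feature names are illustrative
# assumptions, not any platform's real API.

GATED_FEATURES = {"text_chat", "voice_chat", "user_generated_content"}

class Player:
    def __init__(self, age_verified: bool = False):
        # In practice this flag would be set by a third-party
        # age-assurance provider, not by the game itself.
        self.age_verified = age_verified

def can_use_feature(player: Player, feature: str) -> bool:
    """Allow gated social features only for age-verified players."""
    if feature not in GATED_FEATURES:
        return True  # non-social features stay open to everyone
    return player.age_verified

print(can_use_feature(Player(), "voice_chat"))      # False: unverified
print(can_use_feature(Player(True), "voice_chat"))  # True: verified
print(can_use_feature(Player(), "single_player"))   # True: not gated
```

Even a toy gate like this implies real infrastructure behind it: a verification provider, stored proof of checks, support for users the system wrongly rejects. Multiply that across every studio, and the compliance cost those fines enforce becomes clear.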
On paper, the law sounds like a step toward safety. In practice, it’s putting enormous pressure on developers and platforms, especially small ones that lack the resources to build expensive compliance systems.
Where Responsibility Really Belongs
Here’s where I struggle. I understand the desire to protect children; nobody wants kids stumbling across the worst parts of the internet. But the internet is, and has always been, an adult space that happens to contain children’s things, not the other way around.
At the most basic level, access to the internet is controlled by adults: a broadband or mobile contract requires an adult with a bank account or credit card. A minor only has access because a parent or a school gave it to them. Yet under the Online Safety Act, it’s adults who now have to prove they are adults. That flips the responsibility upside down.
If the true goal is to limit minors’ exposure, why not place the legal burden directly on parents? Fine parents if their child is caught using the internet without supervision. Require devices to ship with parental controls already enabled. Hold adults accountable for granting access, because they are the ones paying for it.
The Problem No One Wants to Talk About
The reason lawmakers don’t go this route is obvious: many parents aren’t tech savvy. Kids often know more about technology than their parents, and that makes enforcement messy. Instead of addressing that problem directly, governments are outsourcing parental responsibility to platforms.
Look at a game like Grand Theft Auto V. It’s rated M for Mature by the ESRB and PEGI 18 in the UK, clearly intended for adult players. If a parent buys it for their 13-year-old, is that a failure of the internet, the game industry, or the parent? I would argue it’s the parent. They made that choice.
Yet the Online Safety Act treats this as a problem for developers and platforms to solve. The result is a weaker internet for everyone: social features are stripped out, indie developers are priced out, and adults without children are forced through pointless verification hoops.
The Wikipedia Example
This isn’t just about games. Take Wikipedia. The Wikimedia Foundation actually sued to stop Wikipedia from being classified as a “Category 1” service, which would have required identity verification. The UK High Court dismissed the lawsuit, but it also warned regulators not to disrupt how Wikipedia operates. Wikimedia chose not to appeal but is still fighting for privacy and editorial freedom.
Why should Wikipedia, or any platform, be forced to police whether a user is 15 or 50? If a parent doesn’t want their child using the site, the solution is simple: block it at home. That’s the parent’s responsibility, not Wikipedia’s.
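And blocking it at home really is that simple. As one low-tech sketch (the exact file location varies by operating system), a parent can add lines like these to a device’s hosts file, /etc/hosts on macOS and Linux or C:\Windows\System32\drivers\etc\hosts on Windows, and the site stops resolving on that device:

```
# Point Wikipedia at a non-routable address on this device
0.0.0.0  wikipedia.org
0.0.0.0  en.wikipedia.org
```

Router-level filters and the parental controls built into every major operating system do the same job per account or per household. The tools already exist; the law simply assumes parents won’t use them.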
The Bigger Risk: A Narrower Internet
The Online Safety Act is already inspiring copycat legislation in the EU, Australia, and the US. The global ripple effect could be enormous. Developers are retreating from open communication features. Indie studios, unable to afford compliance, may disappear. Larger platforms with built-in moderation tools will consolidate power, making it harder for new players to compete.
The irony is that a law meant to make the internet safer may instead make it smaller, less creative, and more monopolized.
Protecting children is important, but the Online Safety Act puts the responsibility in the wrong place. Parents control the accounts, the devices, and the household rules. They should also control access. Instead, platforms and developers are carrying the weight, and the cost, of decisions parents make every day.
If we keep heading down this path, the internet risks becoming a censored, corporatized shell of itself. The better alternative is clear: put responsibility back where it belongs, on parents, not platforms.