What’s Actually Happening?
Steam and Itch.io recently removed hundreds of adult games from their storefronts, igniting controversy. The narrative? That payment processors like Mastercard and Visa demanded the removal of NSFW titles.
Mastercard and Visa say that’s not quite true. Both companies clarified they didn’t directly contact Steam or Itch to force any specific takedowns. Instead, they pointed to the existing rules platforms must follow if they want to process transactions through their networks. Those rules haven’t changed. The way companies are reacting to them clearly has.
Steam and Itch didn’t wake up one day with a new moral compass. They’re reacting to risk. That risk is built into how Mastercard and Visa enforce their policies.
The Real Rules Behind the Scenes
Let’s break down what the rules actually say.
Mastercard Rule 5.12.7 requires merchants to block illegal adult material and ensure nothing unlawful can be purchased using their cards. That means:
- Verifying legality of content
- Implementing age checks and consent mechanisms
- Avoiding anything that could damage Mastercard’s brand or legal standing
Visa’s rules are similar. They’ve tightened them in recent years through the Visa Integrity Risk Program (VIRP), which demands stricter oversight of adult content merchants, including things like:
- Advanced Know Your Customer (KYC) processes
- Continuous monitoring
- Pre-approval for adult content
In short: they don’t ban all adult content, but they do set the bar high enough that platforms often err on the side of caution.
Why Was “No Mercy” Removed?
No Mercy wasn’t just adult content. It depicted incest, rape, and sexual violence. Its own developer described the game as centering on male domination and non-consensual sex. That’s not just NSFW; it’s illegal. And that’s exactly the type of content payment processors flag as a violation of their longstanding rules.
Mastercard and Visa don’t need to name a game to make it disappear. The rules themselves make sure anything close to illegal content is seen as radioactive. Once one game is pulled, platforms start combing through others. They preemptively strip out risk to keep access to payment processing intact.
It’s Not About Morals. It’s About Liability
Mastercard says it “follows standards based on the rule of law.” Fair enough. But in practice, reputation and risk shape those standards more than any objective legal review.
Let’s be honest. If this were purely about consistency, we wouldn’t be seeing this selective enforcement. Game of Thrones showed incest and rape on television. Bell Biv DeVoe’s “Do Me” is still streaming everywhere, even with lyrics that strongly imply underage sex. OnlyFans exists because Mastercard and Visa allow it, albeit with higher processing fees and stricter safeguards.
So why do some forms of adult content get greenlit, while others get wiped?
Because payment processors don’t want their brand associated with controversy. It’s not about whether content is illegal. It’s about whether it might become a problem.
The Double Standard Is the Point
Let’s not sugarcoat this. Mastercard and Visa do allow adult content, but only when it’s convenient. Platforms like OnlyFans are profitable enough to justify the legal risk and the safeguards. Game devs selling visual novels? Not so much.
Developers have found that games can stay listed if they remove certain keywords or delete adult-only DLC, even when the core game doesn’t change. The enforcement doesn’t seem based on actual content review, but rather on metadata and descriptions.
This kind of vague, keyword-driven censorship is especially harmful to indie creators, particularly LGBTQ+ devs exploring serious topics like trauma or emotional complexity. When a non-sexual game about domestic abuse (like Last Call) gets removed alongside pornographic incest simulators, the system is clearly broken.
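To see why metadata-driven enforcement is so blunt, consider a hypothetical keyword filter. This is purely illustrative (no platform has published its actual moderation logic, and the keyword list and function here are invented for the example): a naive scan over store-page text flags a serious narrative game just as readily as explicit content, because it never looks at the content itself.

```python
# Hypothetical sketch of naive keyword-based flagging over store metadata.
# Illustrative only; the keyword list and logic are assumptions, not any
# platform's real implementation.

FLAGGED_KEYWORDS = {"incest", "non-consensual", "abuse"}  # assumed list


def flag_listing(title: str, description: str, tags: list) -> bool:
    """Flag a listing if any keyword appears anywhere in its metadata,
    with no review of the actual game content."""
    text = " ".join([title, description, *tags]).lower()
    return any(keyword in text for keyword in FLAGGED_KEYWORDS)


# A non-sexual narrative game about surviving abuse is flagged
# identically to explicit content; the filter can't tell them apart.
serious_game = flag_listing(
    "Last Call",
    "A narrative game about escaping domestic abuse.",
    ["story-rich", "drama"],
)
explicit_game = flag_listing(
    "Example Title",
    "Explicit adult content depicting incest.",
    ["adult", "nsfw"],
)
print(serious_game, explicit_game)  # both True
```

The point of the sketch is the failure mode: any filter keyed on descriptions and tags alone will conflate a trauma narrative with pornography whenever they share vocabulary.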
What Happens If the Rules Get Broader?
Right now, Mastercard and Visa’s rules narrowly focus on illegal sexual content: rape, child exploitation, incest, and non-consensual acts.
Here’s the question that should worry everyone: What if they expand those rules?
If a bank or processor decides something is too risky to touch, say a politically controversial game or one depicting real-world violence, what stops them from pulling it? Fictional murder is legal to depict in a game. So is consensual sex. The distinction isn’t moral. It’s about corporate comfort and public scrutiny.
If financial pressure can remove an adult visual novel today, what’s to stop it from silencing a political game tomorrow?
Know the Rules, But Question the Power
If you’re making adult content, you absolutely need to understand the rules from Mastercard and Visa. Follow them, document compliance, and don’t assume platforms will protect you. As Steam and Itch have shown, once payment processors get involved, nobody wants to take the fall.
Also, don’t let these companies off the hook.
They’ve built a system where they don’t need to issue takedown demands. All they have to do is maintain vague policies and let fear do the rest. That’s not accountability. That’s corporate censorship hiding behind legalese.
This is bigger than porn games. It’s about who controls digital commerce and how quietly that control can reshape our media. Today it’s adult games. Tomorrow, who knows?
Understand the rules. Don’t stop asking who gets to write them.