The Anti-AI Backlash Is Making Things Worse

Fear of AI is turning suspicion into fact. It’s making it harder to tell what’s real while undermining genuine criticism.

Is the anti-AI backlash making it harder to tell what is real? Because right now, saying something “looks like AI” is being treated as evidence, and that has consequences.

The backlash exists for real reasons. People are worried about losing their jobs. There’s genuine fear that as AI-generated videos and pictures look more realistic, they can be used to spread misinformation.

None of that is imaginary, but when accusations are made without any proof, the backlash starts doing more harm than good.

The Blue Prince situation shows exactly how that happens.

The Blue Prince Accusations 

After Blue Prince won Game of the Year at the Indie Game Awards, The Escapist published an article claiming the puzzle game was made with AI. The Escapist never shared any evidence to support that claim, and the original article has since been edited. The game’s publisher, Raw Fury, released a statement denying the developer used generative AI to make Blue Prince.

Despite that, the accusations haven’t gone away.

It doesn’t help that The Escapist posted its article shortly after the Indie Game Awards revoked the GOTY award from Clair Obscur: Expedition 33, after that game’s developer confirmed the use of generative AI. Anger over that decision was still fresh, and Blue Prince was caught in the crossfire.

How a Lack of Evidence Undermines Real Criticism

When people accuse creators of using AI without evidence and are later shown to be wrong, the anti-AI movement loses some credibility. It makes it easy to dismiss any criticism towards AI as overblown or reactionary. It turns legitimate concerns into something that looks hysterical from the outside.

It also takes attention away from real harms like deepfakes or scams. These are serious issues that need to be addressed. Instead, energy is spent focusing on the wrong things.

Over time, this creates a chilling effect. Creators learn that transparency is risky, since even limited use of AI tools can trigger criticism. So they hide it, which makes it impossible to ensure AI is used responsibly.

Why This Makes It Harder to Tell What’s Real

Generative AI is now good enough that most people struggle to tell the difference between human and AI-generated content. That includes text, art, and even music. AI detection tools are prone to error and can even be biased.

At the same time, suspicion has become the default reaction. If something feels a little too polished, or slightly off, people assume it’s “AI” first and ask questions later. Sometimes they never ask at all.

These accusations are usually driven by emotion. They tap into fears about job instability and a decline in the quality of creative work.

Once a narrative is established, it feeds on itself. Echo chambers reinforce the assumptions while algorithms amplify them. And because it’s getting harder to judge whether something is real, doubt lingers even after a claim is confirmed or debunked.

What a Healthier Response Looks Like

If the goal is accountability, producing evidence should be the standard.

“This used AI” should be treated as an assumption until it’s proven to be a fact. Proof means statements from creators, credible reporting, or clear documentation.

It would be better if the anti-AI crowd advocated for stronger regulation and encouraged businesses to build environments where human workers use AI to make their jobs easier.

Criticism should focus on the ways AI causes harm. Treating any form of use as a moral failure is not the way.

When accusations are wrong, they should be corrected as soon as possible. Without retractions, legitimate AI criticism loses its credibility.

People need to understand that saying something “looks like” it could be made with AI proves nothing. Demanding that nobody use AI under any circumstances is unrealistic. If the anti-AI backlash keeps pushing accusations without facts, it will only make things worse.
