Something is changing in gaming and not everyone is happy about it.
A lot of people are starting to question whether the things they consume are made by a human or an AI. The games you play, the drawings flooding your feed, or the songs you listen to could be AI-generated and it’s getting harder to tell the difference.
This matters more than people realize.
Five Hours of Frustration
Crime Scene Cleaner is a relaxing simulation game where you play as Kovalsky, a crime scene cleaner. You mop floors, scrub walls, and deal with the aftermath of some grisly crimes.
In late March 2026, the developers at President Studio announced an upcoming DLC and released Act 2 as a free update. The update added five new missions that wrap up Kovalsky’s story.
Then the bugs popped up.
Some areas looked dirty when they were perfectly fine. Others that actually needed cleaning didn’t register at all.
While working on the mission Secluded Retreat, the game insisted an area still had missing trash. I searched for the longest time and eventually ended the mission anyway. The game marked it as completed and gave me the achievement. In the next mission, Circle of Friends, I spent five hours searching for missing dirt and blood, assuming it was bugged like the last one. It wasn’t, and because I gave up, I didn’t earn the achievement. Act 2 shouldn’t have been released in this state.
Unfortunately, this is part of a pattern. Many studios push out games or updates before they’re ready. Players are forced to deal with the issues until a patch comes later. Everyone moves on until it happens again.
Then Came the Accusations
On top of the bugs, players started accusing Crime Scene Cleaner of using AI-generated art.
The developers responded with this statement, making it clear they don’t use AI-generated art, dialogue, or voices. Everything is made either by their own artists in-house or purchased from legitimate sources like the Unity Asset Store and stock photo libraries.
They also explained a technique called photobashing: the process of combining and editing real photos to make new images. The method has been around since the 1850s, and digital photobashing has been used in games and films for nearly 30 years. AI isn’t involved, but it can generate results that look similar, since most AI tools are trained on human-made art.
They even replaced some purchased textures due to concerns from the community. President Studio has been open and transparent throughout the whole scandal, which isn’t common in situations like this.
Sounds like everything has been resolved, right?
Nope!
The Problem No One Wants to Say Out Loud
In their statement, the developers admitted it’s hard to tell the difference between AI-generated and human-made art. Again, they DON’T use AI, but they acknowledged that something could slip through without them knowing. They do their best to verify purchased assets, but they can’t be one hundred percent sure every one was made by a human.
When you buy assets from a third-party marketplace, you’re trusting a long chain of people to be honest. Let’s say a small studio buys a texture pack. The seller doesn’t disclose it was AI-generated. The studio puts it in their game, believing it’s human-made. Are they lying when they say they don’t use AI?
The Unity Asset Store has thousands of contributors. Steam now requires developers to disclose AI use, but only if they know about it. This only works if every vendor is honest.
This issue is not unique to Crime Scene Cleaner. It’s an industry-wide problem. In 2026, no studio buying outside assets can guarantee they’re 100% AI-free. The current supply chain doesn’t support that level of certainty yet.
The Law Is (Sort Of) Catching Up
A recent court case involving AI companies using books as training data could change how artificial intelligence learns.
Anthropic, the company behind the AI assistant Claude, was sued by a group of authors. They claimed their books were used as training data without their permission. In June 2025, Judge William Alsup ruled that developers are allowed to use legally purchased books to train an AI without getting the author’s consent. The process can be considered “fair use” because it was “exceedingly transformative.” In other words, buying a book and learning from it, even if you are a machine, doesn’t count as stealing.
But…and this is important.
The same ruling found that Anthropic had downloaded over 7 million books from illegal websites without the authors’ knowledge. That part was ruled as copyright infringement. Anthropic eventually agreed to pay $1.5 billion to settle, covering approximately 500,000 works at around $3,000 per book, and agreed to destroy the unlawfully obtained files.
So the court drew a clear line. Buying content to use for training AI is acceptable. It’s only a problem when the content is stolen or pirated.
In response to the legal challenges, Anthropic began buying books in bulk, physically scanning them, and feeding the pages into their training data.
This case highlights a long-standing issue in the development of generative AI. The early days of AI relied on scraped, unverified content that was taken without permission. But that doesn’t mean today’s AI models are trained on illegal content. Legal and public pressure is pushing companies to pursue licensing deals for access to content.
Does that undo the past? No. Some experts note it fits a familiar tech-industry pattern: grow the business first, then pay a relatively small fine. Still, blanket claims that AI is trained on stolen art are getting old. The reality is more complicated now: some training data probably was stolen, but much of it was purchased legitimately.
The artists and writers who were never compensated for scraped content in 2021 or 2022 are still hurting. Website owners are dealing with lost traffic because AI summarizes their content instead of sending visitors to it.
The fight over creative rights in the age of AI continues. It’s a battle that will shape how digital content is produced and most people have no idea it’s even happening.
The Artists Getting Hurt by Baseless Allegations
When players see something that looks AI-generated and call it out in reviews without evidence, real artists are the ones who pay the price. The developers of Crime Scene Cleaner addressed this: “The backlash against AI is starting to turn into backlash against actual artists.”
There are already documented cases of human artists being banned from online communities because their hand-made work was mistaken for AI. Imagine spending years developing your craft, only for strangers on the internet to call it “AI slop.”
It’s devastating.
At the same time, the concerns driving those accusations are valid. AI art tools were trained on millions of images scraped from the internet, often without the original artists’ knowledge or consent. A major lawsuit included over 4,700 artists whose work was used this way. The people most affected are digital artists, illustrators, concept artists… the same kinds of people who work on games like Crime Scene Cleaner.
YouTube, Animals, and the Softer Side of This
Not all AI content is controversial.
AI-generated animal videos on YouTube or Instagram, for instance, are just… cute. Many viewers can tell they are AI and they don’t care. Nobody is being accused of anything and no one is getting hurt.
That contrast is interesting. AI only feels threatening when the livelihoods of human artists are at stake. When someone’s craft and income depend on art that’s mistaken for AI, the conversation changes completely.
What This All Adds Up To
Gaming is in the midst of a trust crisis. Buggy launches, unclear asset sourcing, detection tools that are unreliable, and an online culture that jumps to conclusions first and investigates later.
Crime Scene Cleaner’s developers tried to do the right thing by addressing the controversy, crediting voice actors by name, and removing questionable assets. They were transparent in a way many studios are not.
But good intentions can only do so much. The culture and the legal system haven’t caught up to how games are made in 2026, and AI-generated art can now look close enough to human work that even studios struggle to tell the two apart.
Until there are trustworthy verification tools, clear marketplace standards, and consistent policies on disclosing the use of AI, studios and players are stuck playing a guessing game.
And the human artists are, as always, caught in the middle.