There is a lot of fear around generative AI right now. Most of it comes from a desire to protect people from losing their jobs.
People look at AI and see a system trained on massive datasets, often scraped from real artists without their permission. They feel like the output is a knockoff of someone else’s voice or visual style.
That fear and overall disgust toward AI have leaked into gaming. Players worry that studios will use AI as a shortcut that will cause the quality of games to drop.
Ubisoft is well aware of the criticism, since they have been integrating AI into how the company operates. Now they are ready to present their new research project, called Teammates.
It’s not an actual video game. It’s closer to a controlled experiment meant to demonstrate how AI could support developers and players without replacing the people who actually make the games.
What Teammates actually is
Teammates is an AI-driven research project built to make NPCs act like real allies during gameplay.
Ubisoft created a small, self-contained prototype set in a dystopian world where the player is part of a resistance team. Your goal is to infiltrate an enemy base and find five missing operatives. Two AI-driven squadmates named Pablo and Sofia respond to both the situation and your voice.
You recover memories from fallen teammates and push through enemy squads. You also get support from an in-game AI voice assistant named Jaspar. Jaspar calls out threats, provides lore, adjusts settings, navigates menus, and can even pause the game, all through natural speech. You don’t need to speak in a specific pattern or follow a set of pre-written commands. The system listens, interprets tone and intent, and reacts in real time.
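Ubisoft hasn’t shared how Jaspar is actually wired up, but the loop described above, free-form speech in, a recognized intent out, then a concrete game action, is a familiar pattern. Here is a rough Python sketch of that idea; every name in it (the intent labels, the game methods) is hypothetical, not anything from Ubisoft’s code.

```python
# Hypothetical sketch of a Jaspar-style assistant loop: speech comes in,
# a language model maps it to an intent, and the game runs a matching action.
# None of these names come from Ubisoft's code; they only illustrate the idea.

from dataclasses import dataclass, field

@dataclass
class Intent:
    name: str          # e.g. "pause_game", "adjust_setting", "ask_lore"
    confidence: float  # how sure the model is about its interpretation
    slots: dict = field(default_factory=dict)  # details, e.g. {"setting": "music_volume"}

def interpret_speech(transcript: str) -> Intent:
    """Placeholder for the language-model call that turns speech into an intent."""
    # A real system would send the transcript plus game context to the model
    # and parse a structured response back into an Intent.
    raise NotImplementedError

def handle_intent(intent: Intent, game) -> None:
    """Dispatch a recognized intent to a concrete, developer-defined action."""
    if intent.confidence < 0.6:
        game.assistant_say("Sorry, I didn't catch that.")
    elif intent.name == "pause_game":
        game.pause()
    elif intent.name == "adjust_setting":
        game.open_settings(intent.slots.get("setting"))
    elif intent.name == "ask_lore":
        game.assistant_say(game.lookup_lore(intent.slots.get("topic")))
    else:
        game.assistant_say("I can't help with that yet.")
```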
The project blends Ubisoft’s own technology with large language models. The goal isn’t just to parse commands but to understand context. It lets NPCs react to your playstyle and speech instead of sticking to canned lines found in most games.
How Teammates uses generative AI inside gameplay
The interesting part is how much the AI is allowed to do and where the limits are drawn.
Ubisoft mixes traditional behavior trees with language models so NPCs can read the room. If you tell Pablo to cover you, solve a puzzle, or check a hallway, you can phrase it naturally. The AI interprets the meaning instead of waiting for a very specific command. It can pick up sarcasm, casual slang, or signs of frustration.
The point is to make NPCs feel like they are paying attention.
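Ubisoft hasn’t published the implementation details, but “behavior trees plus a language model” usually means the tree still owns what an NPC is allowed to do, while the model only decides which hand-authored behavior a spoken request maps to. A minimal sketch of that split, with entirely made-up names:

```python
# Hypothetical sketch of mixing a behavior tree with a language model.
# The tree owns the actual behaviors; the model only picks which
# developer-authored behavior a natural phrase refers to.

KNOWN_BEHAVIORS = {
    "cover_player":  "Take a defensive position near the player",
    "check_hallway": "Scout the indicated corridor and report back",
    "solve_puzzle":  "Interact with the nearby puzzle object",
}

def classify_request(phrase: str) -> str | None:
    """Ask the language model which known behavior the phrase maps to.

    Placeholder: a real version would prompt the model with the phrase,
    the behavior list, and current game context, then validate that the
    answer is one of the allowed keys.
    """
    raise NotImplementedError

def on_player_speech(phrase: str, npc) -> None:
    behavior = classify_request(phrase)
    if behavior in KNOWN_BEHAVIORS:
        # Hand off to the ordinary, hand-authored behavior tree node.
        npc.behavior_tree.run(behavior)
    else:
        # Nothing matched: the NPC stays on its default behavior.
        npc.say("Not sure what you mean. Can you point it out?")
```

The important detail is that the model never invents new behaviors. It can only point at ones the designers already built.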
The system also lets characters improvise, but only inside the boundaries that human writers create. Developers define each character’s personality, goals, emotional range, and lore. That prevents characters from going off model while still giving them more flexibility than pre-written scripts.
This is also why Ubisoft built an internal API to filter output. It’s meant to stop hallucinations, bias, or harmful dialogue before it reaches the player. They are trying to scale the tech without letting it become unpredictable.
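That internal API isn’t public either, so treat the following as a guess at the shape of the thing: writers define the character, the model improvises a line, and a filter checks that line against the writers’ rules before the player ever hears it. All names below are invented for illustration.

```python
# Hypothetical sketch of a dialogue guardrail: writers define the character,
# the model improvises, and a filter rejects lines that break the rules
# before they ever reach the player. Not Ubisoft's actual API.

from dataclasses import dataclass, field

@dataclass
class CharacterSheet:
    name: str
    personality: str  # written by humans, not generated
    banned_topics: list[str] = field(default_factory=list)
    fallback_line: str = "Stay sharp."

def violates_rules(line: str, sheet: CharacterSheet) -> bool:
    """Very rough stand-in for a real moderation / filtering service."""
    lowered = line.lower()
    return any(topic in lowered for topic in sheet.banned_topics)

def deliver_line(generated: str, sheet: CharacterSheet) -> str:
    """Return the generated line if it passes the filter, a safe fallback if not."""
    return sheet.fallback_line if violates_rules(generated, sheet) else generated

pablo = CharacterSheet(
    name="Pablo",
    personality="Loyal, dry humor, never breaks character",
    banned_topics=["real-world politics"],
)
print(deliver_line("Copy that, moving to cover.", pablo))  # passes the filter unchanged
```

The real filter is surely far more sophisticated than a keyword check, but the division of labor is the point: humans write the rules, the system enforces them.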
All of this runs inside Ubisoft’s Snowdrop engine, supported by Google’s Gemini model. The result is an NPC that can adjust to what is happening and make choices based on the actual situation.
The goal is not to replace developers
Ubisoft is fully aware that people see AI and assume it means fewer writers or fewer designers. That is not what Teammates is trying to do.
The core story direction, character creation, and emotional beats are still written by humans. Designers still build the environments, choose the mission flow, and define the rules of the world. The AI is only there to interpret what you say and improvise within the boundaries those humans set.
Ubisoft says that Teammates is a testbed. The end goal is to create tools that developers can use to build richer NPCs.
They compare this experiment to the moment gaming moved from 2D to 3D.
What this could mean for the future of gaming
AI in gaming has been treated like a threat for so long that nobody stops to ask whether there are ways to use it responsibly. Teammates is Ubisoft’s answer to that question. It shows a version of AI that supports the story, the world, and the player. All while humans still write the voices and control the direction.
This approach will not erase the concerns people have around AI. It does not magically solve the questions about training data or authorship. But it does reframe the conversation. Instead of asking whether AI is good or evil, it pushes us to ask how the tool is being used and who is steering it.
If gaming wants to grow, it needs more than fear. It needs experiments like this that are willing to explore what AI can do without stripping out human creativity.