Epic Games is rolling out a new “conversations” tool for Fortnite creators that lets them build AI-powered NPCs capable of unscripted dialogue. Yes, the same company whose AI Darth Vader decided to swear last year is now giving thousands of developers the keys to create talking characters powered by Google’s Gemini models and ElevenLabs voice synthesis.
I find this fascinating for all the wrong reasons. Epic is essentially saying “here’s a generative AI system that can say almost anything, now please don’t make it weird.” And to their credit, they’re at least trying to get ahead of the obvious problems.
The tool itself is straightforward. Instead of building traditional dialogue trees, you write prompts that define the character’s personality, knowledge, and behavior. The system uses Gemini 3.1 Flash-Lite for processing audio and generating text, then ElevenLabs converts those responses into voice. It’s a pretty standard LLM pipeline wrapped in game engine tooling.
The Rules That Tell You Everything
What really caught my attention are Epic’s explicit restrictions. Three rules stand out, and they’re incredibly revealing about what the company expects developers to try:
No medical or mental health guidance. No romantic partners or intimate companions. No attempts to bypass safety systems.
That second rule is doing a lot of heavy lifting. Epic saw the future where teenage Fortnite players would inevitably try to romance AI quest givers and said “absolutely not.” It’s the kind of policy you only write after seeing what happens in other platforms or running internal testing that made everyone uncomfortable.
The circumventing safety systems rule is equally telling. They know developers will try to jailbreak these NPCs. It’s not paranoia, it’s pattern recognition. Every content moderation system ever built gets tested by users trying to break it, and Epic is just acknowledging that reality upfront.
The Experimental Label Does Real Work
Here’s the important part: this is marked “experimental” and developers can’t publish anything until it hits beta. There’s no timeline for when that might happen. Epic spokesperson Jake Jones was careful to say they have “no timeline to share.”
That’s smart. They’re essentially running a contained beta test with their creator community before opening the floodgates. Every weird edge case, every unexpected interaction pattern, every creative attempt at bypassing restrictions gets logged and analyzed before regular players see any of this.
I suspect Epic learned something from the Darth Vader incident. When your AI character starts swearing in a recreation of James Earl Jones’ voice, that’s not just a technical failure. It’s a brand problem, a PR nightmare, and possibly a legal liability all rolled into one.
Platform Ambitions Meet Reality
This tool is part of Epic’s broader push to transform Fortnite from a game into a platform for creator-made experiences. They’ve been investing heavily in this vision, adding official Star Wars assets and other tools to enable complex custom games.
But the timing is awkward. Epic just announced layoffs affecting over 1,000 workers, citing a “downturn in Fortnite engagement” that started in 2025. So they’re simultaneously trying to expand the platform’s capabilities while dealing with declining user numbers.
The conversations tool could be genuinely useful for creators who want to build more dynamic narrative experiences. Imagine quest givers that can actually respond to player questions naturally, or tutorial characters that adapt their explanations based on what the player seems confused about.
But it could also be a minefield. Every AI-powered NPC is a potential PR incident waiting to happen. Every conversation is a chance for the model to say something inappropriate, offensive, or just deeply weird. And unlike a scripted dialogue tree where you can review every possible branch, these systems are probabilistic black boxes.
Epic is betting they can thread this needle: give creators powerful tools while maintaining enough control to avoid disasters. Whether that’s actually possible at scale with generative AI remains an open question that no one in the industry has definitively answered yet.