
Why I Built an AI Worldbuilding Tool (And Abandoned the Project I Actually Meant to Build)

By Ryan Dean · 6 min read

I wasn't supposed to build Canonry.

I was working on something else entirely. A different project, one that needed a rich fictional world as a test bed. Not a sketch of a world — a deep one. Consistent history, believable cultures, interconnected characters. The kind of setting where you could pull on any thread and find it connected to three others.

So I went looking for tools to help me build one. And that's where the trouble started.

The tooling problem

There's actually a lot of great worldbuilding software out there. World Anvil, Campfire, LegendKeeper, Notebook.ai — genuinely useful tools built by people who care about the craft. Wiki-style organization, relationship maps, timeline builders. If you're the kind of person who wants to catalog everything (and if you're a worldbuilder, you probably are), there's no shortage of options.

But here's the thing. It's 2026. We have AI that can write poetry, debug code, and hold a conversation about the geopolitical implications of fictional trade routes. And most worldbuilding tools still feel... dumb? Not in a pejorative way — they're well-built. They're just static. They're filing cabinets. Really nice filing cabinets, with color-coded tabs and cross-references, but filing cabinets nonetheless.

They store your ideas. They don't have opinions about them.

The AI problem

OK, so what about the AI-native tools? Just throw your world at ChatGPT and let it rip?

I tried that. Multiple times. With increasingly elaborate prompts. And what I got back was... I mean, let's be honest — it was garbage. Not in the sense that it was technically wrong (though sometimes it was). It was garbage because it was generic. Fantasy slop. The kind of output that could describe literally any world. Elven forests with ancient wisdom. Dark lords with shadowy armies. Mystical artifacts of untold power.

You know the voice. You've read it a hundred times. It's that uncanny valley of creativity where everything is technically coherent but nothing feels yours.

The problem isn't that AI can't write. It clearly can. The problem is that a general-purpose AI has no concept of what makes your world different from every other world it's been trained on. It doesn't know that your elves are actually fungal colonies, or that your "dark lord" is a grief-stricken bureaucrat, or that magic in your setting is an industrial byproduct and everyone hates it. It defaults to consensus fantasy because that's what it's seen the most of.

And consensus is the enemy of good worldbuilding.

The tracking problem

Even setting AI aside, I'd been frustrated with my own process for years. I've built a lot of worlds — for campaigns, for fiction projects, for fun. And the thing that always gets me isn't the creation. It's the maintenance.

You build a religion with three denominations. Six months later, you can't remember whether the reformist sect split before or after the plague. You write a character with a grudge against a specific noble house, and two sessions later you've accidentally made that house her ally because you forgot the grudge existed. You name a river, and then name it something different in a document you wrote three months later, and now you have two rivers.

This isn't a failure of imagination. It's a failure of tooling. Human memory isn't built to hold the kind of detail that makes worlds feel alive. And the tools I was using — Google Docs, Notion, the occasional wiki — weren't built for this either. They're general-purpose note-taking. They don't understand what a world is.

What I actually wanted

Here's where it gets personal. When I think about the moments in worldbuilding that actually excite me — the moments where I sit back and think "oh, that's good" — they're never about the big lore reveals. They're about connections.

A blacksmith in a frontier town who hates the local garrison. Not because of some grand ideological conflict, but because her grandfather was conscripted and never came back, and every time she shoes their horses she's helping the institution that took him. That's the stuff. A character with a reason to exist that connects to the world's history in a way that feels inevitable, even though you just made it up ten minutes ago.

When I'm running a game and a player asks "why does this NPC care about this?" — and I have a real answer that links back to something three sessions ago — that's the high. That's what worldbuilding is for. Not encyclopedias. Not wikis. Not lore dumps. Connections that make a place feel like people actually live there.

I wanted a tool that could think that way. Not just store facts, but understand how facts relate to each other. Not just let me write a character, but ask me "how does this character feel about the faction you created last week?" Not just record history, but notice when my history contradicts itself. And then, with that knowledge, I could decide: do I want these conflicting histories? Are they different viewpoints, propaganda, lost lore? Or just evidence of my aging memory?

The prompt engineering detour

So, like any reasonable person, I spent a few weeks building increasingly unhinged prompt chains in Claude Code.

I do AI work professionally. I've spent more time than I'd like to admit learning what large language models are actually good at, and — maybe more importantly — what they're terrible at. And the thing I kept coming back to is this: with big ideas, AI is genuinely great at collaboration. Not generation. Collaboration. They can be great at generation too, but only with a plan to follow; the models need guidance and guardrails.

There's a difference. Generation is "write me a creation myth." You get back something competent and soulless. Collaboration is "I have a monotheistic world where the god is fracturing into splinter personalities. What would the first schism look like from the perspective of a priest who doesn't know what's happening?" That's a conversation. The AI has something to work with. It can push back, suggest angles you hadn't considered, ask clarifying questions that force you to think harder.

The AI doesn't need to be creative. It needs to be curious about your creativity.

I started building structured prompts with memory systems and tool access. Specialists, basically. One prompt that thought like a historian. Another that approached things like an ecologist. A third that focused on cultural anthropology. Each one had its own frame of reference, its own set of questions it would naturally ask, its own way of pushing back on ideas that felt underdeveloped.
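If you want a feel for the shape of this, here's a minimal, hypothetical sketch — not Canonry's actual code, just an illustration of the idea. Each specialist is little more than a system prompt plus the stock questions its frame of reference makes it ask, and every note you write gets run past each of them. (The names, fields, and questions here are all invented for illustration.)

```python
from dataclasses import dataclass

@dataclass
class Specialist:
    """A domain persona: a system prompt plus its natural opening questions."""
    name: str
    system_prompt: str
    probing_questions: list

    def review(self, note: str) -> dict:
        # In a real system this would call an LLM with self.system_prompt
        # and the note; here we just surface the persona's questions.
        return {
            "specialist": self.name,
            "note": note,
            "questions": self.probing_questions,
        }

SPECIALISTS = [
    Specialist(
        "historian",
        "You reason about timelines, causation, and how long change takes.",
        ["Does the timeline allow for this development?",
         "What earlier event does this depend on?"],
    ),
    Specialist(
        "ecologist",
        "You reason about food, climate, and resource flows.",
        ["What are these people eating?",
         "What resource supports this settlement?"],
    ),
]

def review_note(note: str) -> list:
    """Run one worldbuilding note past every specialist."""
    return [s.review(note) for s in SPECIALISTS]
```

The point of the structure is that each persona carries its own frame across conversations, so the pushback you get is consistent and domain-shaped rather than generic.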

The creative control stayed with me. I decided what was canon. But instead of staring at a blank page and trying to invent everything from scratch, I had collaborators. Opinionated ones. The historian would point out that my timeline didn't account for how long it takes a civilization to develop metallurgy. The ecologist would ask what the people in my coastal cities were eating. The mythologist would notice that my two rival religions had suspiciously similar origin stories and ask if that was intentional.

It was. But I hadn't realized why until she asked.

The pivot

I was supposed to be testing a different project. Instead, I'd spent three weeks building a world, and the process was the most fun I'd had creating anything in years.

That's not hyperbole. I've been building worlds since I was a teenager. D&D campaigns, homebrew settings, unfinished novels — the whole catalog. And this was different. Not because the output was better (though it was). Because the process felt like having collaborators for the first time. People who cared about my world specifically, who had expertise I didn't, who would push back when I was being lazy and get excited when I hit something good.

So I stopped working on the other project. Didn't agonize about it. Didn't write a business plan. Just... started building the tool. Because if this process was this good for me, it would be good for other people who care about their worlds the same way.

That's the whole origin story. No market research. No competitive analysis. Just a person who got distracted by a better idea.

What it became

Canonry is what those prompt chains turned into. The core idea survived intact: domain specialists instead of one general-purpose chatbot. A historian, an ecologist, a mythologist, a cultural anthropologist — each one focused on a specific dimension of worldbuilding, each one maintaining context about your world across conversations.

Nothing becomes canon without your approval. The specialists suggest, question, pressure-test, and occasionally argue with each other. But you decide what's real. Your world, your vision, your final say.

It's not a wiki. It's not a chatbot. It's closer to having a writers' room, except everyone in the room has actually read your previous work and remembers what you said about the river. Here's how it works.

The other project

It's still sitting there, by the way. Waiting patiently. I'll get back to it eventually.

Probably.


If any of this resonated — if you've ever lost track of a river name or wished your worldbuilding tools could push back on a half-baked idea — Canonry is free to try.


Ryan Dean

Worldbuilder, engineer, and the person behind Canonry. Writing about the craft of making fictional places feel real.
