Transformers.js In Web Games

Posted on: April 30, 2026

I’ve been building a small roguelite autobattler for the Three.js Journey challenge and community. The core loop is simple and familiar if you play these kinds of games: you create a conductor, send a train across an overworld map, pick routes, collect resources, add train cars, and resolve battles through autobattle systems.

Overworld map of my browser-based roguelite autobattler.

One feature I wanted to experiment with was local AI generation in the browser, not as the core mechanic, but as a flavor layer. That became the Tome of Wonder.

The Tome of Wonder is an opt-in feature that uses Transformers.js to generate conductor backstories and personalize overworld events based on that character. During conductor creation, the player can click a dice icon to download the Tome locally. Once it is ready, the game can generate a short backstory for that conductor. Later, when the player hits event nodes on the overworld map, the Tome can use that conductor’s race, description, tags, and backstory to rewrite the event text.
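The event rewrite is really just a prompt assembled from authored data. Here is a rough sketch of what that assembly could look like; the function name and the conductor/event field names are hypothetical stand-ins for whatever the real game objects hold:

```javascript
// Build the chat-style messages sent to the text-generation pipeline when the
// player lands on an event node. The model only rewrites flavor text; rewards
// and choice IDs never appear in a form it can change.
function buildEventPrompt(conductor, event) {
  return [
    {
      role: 'system',
      content:
        'You rewrite short game event text to fit a character. ' +
        'Reply with JSON: {"title": string, "summary": string}. ' +
        'Do not invent rewards or choices.',
    },
    {
      role: 'user',
      content:
        `Conductor: ${conductor.name}, a ${conductor.race}. ` +
        `Tags: ${conductor.tags.join(', ')}. ` +
        `Backstory: ${conductor.backstory}\n` +
        `Event title: ${event.title}\n` +
        `Event summary: ${event.summary}`,
    },
  ];
}
```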

The Tome is an explicit download. At first I called it “Local Model,” then realized that term pulls you out of the game world, and few players know what it means anyway.
Once it is downloaded, the Tome can draft a local backstory during conductor creation.

The first version used a smaller Gemma model while I proved out the browser flow. The Tome now uses Gemma 4 E2B, with the ONNX model published for Transformers.js:

onnx-community/gemma-4-E2B-it-ONNX

This model takes longer to download than the Gemma 3 model I used at first, but the tradeoff is worthwhile for richer backstory generation. Since the Tome is optional and cached in the browser after the first download, the player chooses when to pay that first-use cost.
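The opt-in download can be wired up roughly like this. The helper names are hypothetical, and the progress-event shape is simplified, but `device: 'webgpu'` and `dtype: 'q4'` are the actual Transformers.js pipeline options the setup below relies on:

```javascript
// Turn a Transformers.js progress event into text for the dice-icon UI.
// (Hypothetical helper; a real progress event also carries file names.)
function formatTomeProgress(event) {
  if (event.status === 'progress' && typeof event.progress === 'number') {
    return `Summoning the Tome... ${Math.round(event.progress)}%`;
  }
  if (event.status === 'done') return 'Tome ready';
  return 'Preparing the Tome...';
}

// Lazily load the generator the first time the player opts in. The dynamic
// import keeps the model code out of the main game bundle, and the browser
// caches the model files after the first download.
async function loadTome(onProgress) {
  const { pipeline } = await import('@huggingface/transformers');
  return pipeline('text-generation', 'onnx-community/gemma-4-E2B-it-ONNX', {
    device: 'webgpu', // run inference on the GPU via WebGPU
    dtype: 'q4',      // 4-bit quantized weights: smaller download, less memory
    progress_callback: (e) => onProgress(formatTomeProgress(e)),
  });
}
```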

The current implementation uses @huggingface/transformers in the browser with WebGPU and q4 quantization: a compressed 4-bit version of the model that uses less memory and downloads faster.

The goal is not to have the model own the game design. Rewards, choice IDs, and balance all stay authored and deterministic; the model only gets to personalize the narrative wrapper: event titles, summaries, choice labels, and outcome text. That design choice matters a lot for games. If the LLM fails, returns weird JSON, or the browser does not support the feature, the authored event still works and the player can keep playing. No server call, no account, no hard dependency.
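That fallback can be as simple as a strict parse-and-validate step around the model output. A sketch, assuming hypothetical field names like `title` and `summary` for the authored event schema:

```javascript
// Try to use the model's rewrite; fall back to the authored event on any
// problem. The authored event is always valid, so a bad generation never
// shows the player a broken node.
function personalizeEvent(authoredEvent, rawModelOutput) {
  try {
    const parsed = JSON.parse(rawModelOutput);
    // Only accept output that carries every field we plan to show, and never
    // let the model touch rewards or choice IDs.
    const ok =
      typeof parsed.title === 'string' && parsed.title.length > 0 &&
      typeof parsed.summary === 'string' && parsed.summary.length > 0;
    if (!ok) return authoredEvent;
    return { ...authoredEvent, title: parsed.title, summary: parsed.summary };
  } catch {
    return authoredEvent; // weird JSON, model failure, feature unsupported
  }
}
```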

A few implementation details made the feature feel better:

Can I use: WebGPU

WebGPU support is getting much better, but it still varies enough that you should check compatibility and design graceful fallbacks. The game runs a browser check first and, when support is available but not enabled, shows browser-specific tooltip guidance for turning it on.

Chrome guidance for checking GPU support and enabling the WebGPU flag when needed.
Firefox guidance for enabling WebGPU in about:config on setups where it is still disabled.
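A minimal version of that check looks like the sketch below. `navigator.gpu.requestAdapter()` is the standard WebGPU probe; the browser-specific tooltip wiring is omitted, and the function names are my own:

```javascript
// True if the WebGPU API object is exposed at all. When this is false, the
// browser either lacks WebGPU or has it disabled behind a flag, so the game
// shows browser-specific guidance instead of trying to load the Tome.
function webgpuApiPresent() {
  return typeof navigator !== 'undefined' && 'gpu' in navigator;
}

// Full probe: the API can be present while no usable adapter exists
// (e.g. blocklisted GPU or software-only environments).
async function webgpuReady() {
  if (!webgpuApiPresent()) return false;
  try {
    const adapter = await navigator.gpu.requestAdapter();
    return adapter !== null;
  } catch {
    return false;
  }
}
```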

The thing I like about this pattern is that it treats local AI like a game feel layer. It does not replace handcrafted systems. It adds a little personality, lets the run feel more specific to the character, and gives web game devs a practical way to experiment with Transformers.js without betting the whole game on generation working perfectly on every device.

I hope to see more web games experiment with Transformers.js. If this inspires you to build something, please share it. I’d love to see it.

Want to try out my game and the Tome of Wonder? You can play it here: https://learning-train.vercel.app/.

Keep in mind, this is just a hackathon project and full of bugs and unbalanced systems, so be prepared, brave conductors 🤓!