Getting better at Englishing with the help of AI and Cloudflare
In this post I jot down lessons I learnt from building Wordazzle, a word game that aims to make the player more articulate by expanding their vocabulary with high-quality words, delivered in the form of a delightful little guessing game. As a broad overview, the player is shown a list of seven questions, the challenge being that the player has to guess the correct word based on its definition. Here is what it looks like in action:

The app has some nifty features, like the ability to save words and test yourself on them later. The question set is refreshed daily, so every day brings with it new challenges and, hopefully, new learnings. The wordset is an array of JSON objects that look like so:
[
  {
    letter: "E",
    question: "What E describes someone who is cheerful and full of energy?",
    answer: "EBULLIENT",
    example: "Her ebullient personality lit up every room she entered.",
  },
  {
    letter: "F",
    question: "What F means intense happiness or bliss?",
    answer: "FELICITY",
    example: "The felicity in her voice was a joy to hear.",
  },
  // so on...
];
And here is the rub: all of this content is generated by GPT-4o! More on this later; for now, let's start from the beginning.
How I Built Yet Another Unprofitable Side Project…
When embarking on Wordazzle I prioritized three non-negotiable principles:
- Developer Velocity: I wanted to build as quickly as possible.
- Hosting Cost: Should be so cheap, yet provide so much value, that I end up writing a blog post raving about the service
- Decent code quality: The thought of sharing my git repo shouldn’t make me sweat and/or cause my heart to race
I aimed to go from idea to prototype in under a month (every developer who's worked on a pet project will know why). The frontend architecture needed to support rich interactivity (timers, input validation, flashcard systems) while remaining lightweight enough to deploy on budget-conscious infrastructure.
Below is a TL;DR of the overall architecture:
Frontend: Next.js + TypeScript, TailwindCSS, and Redux Toolkit.
Backend: A Cloudflare Serverless Worker built with the Hono framework, using Cloudflare Workers KV as the NoSQL database. The app is hosted on Cloudflare Pages.
Next.js: Routing Ergonomics, Please
I chose Next.js exclusively for its file-system routing (yes, shame on me) – a decision born from painful past experiences with react-router's boilerplate-heavy implementations. The ability to define routes via the /app/* directory structure eliminated entire categories of routing bugs, and tears. There were some key trade-offs, however; Next.js 13+'s server-first paradigm clashed with Wordazzle's client-only needs. While I appreciate server components for content-centric apps, littering my app with use client directives felt like fighting the framework.
I subsequently learnt about Vite and will be using that as my go-to React tooling, as I'm unable and uninterested in keeping up with all the API changes that are part and parcel of Next.
Redux Toolkit: Or Why useContext Should Be hardlyUsed
I have built apps with complex state management using React's native useReducer and useContext APIs. Although it felt clever at first, writing custom migration logic, hooks, and middleware for handling async actions and localStorage persistence turned out to be tiresome and error-prone. A little older and a little humbler, I chose to go with Redux Toolkit, and I am happier for it. Again, ergonomics.
TailwindCSS: Finally, I Can Make Pretty Layouts!
I originally contemplated using Styled Components so I could write All The CSS, but figured I'd see what the hoopla around TailwindCSS was, and oh boy, was there hoopla to be found. For me, the biggest win was the excellent design-token system that came out of the box; it allowed me to rapidly prototype the UI without fidgeting with the usual suspects that developers fidget with, like colors, spacing, and font sizes. It also felt really nice having all the styles of my component directly in my component file as opposed to a separate CSS file (i.e. no context switching), a feature I really valued coming over from Styled Components. On the flip side, the learning curve can be a little steep; for example, here is all the styling for the site title (located top left in the previous screenshot):
<h1
  class="text-2xl sm:text-5xl lg:text-6xl font-extrabold
    mb-4 mt-3
    bg-clip-text bg-gradient-to-r text-transparent
    from-indigo-400 via-pink-600 to-rose-500
    uppercase tracking-wider drop-shadow-md"
>
  <a href="/">Wordazzle</a>
</h1>
As you can see, some of the class names are quite intuitive; others, not so much. Also, why is it text-6xl to make the text large, but font-extrabold to make the text bold? I don't get it, but c'est la vie. All in all, TailwindCSS is great and I'll be using it as my go-to CSS framework moving forwards.
Backend Architecture (Or Why I Love Cloudflare)
Cloudflare Pages: Wordazzle is hosted on Cloudflare Pages, a JAMstack platform that serves your app through Cloudflare's Edge Network and makes deploying/updating your app as simple as:
- Push code to GitHub
- Watch Cloudflare auto-build and deploy to 300+ edge locations
- Profit 💸
Picture sub-200ms global load times courtesy of pre-rendered pages and Edge Network caching. Pages delivered in a sublime way as far as speed was concerned, but not so much in terms of analytics; CF doesn't give you the in-depth analytics and pretty graphs that Vercel does, but it does just enough to deter you from implementing Google Analytics on Day One.
Cloudflare Workers: Serverless Serves Well
The backend is a serverless function by the name of Cloudflare Workers. Workers run on V8 isolates (not containers), so cold starts average ~5ms (yes, you read that correctly). They are extremely easy to work with, and your dev and prod environments are separated by default, meaning you can test away to your heart's content. If you're using Workers, it'll behoove you to use Hono, a delightful little framework that makes your code much cleaner and more manageable through its streamlined approach to defining routes and its battery of useful middleware functions. Here is the entirety of my Worker's index.ts file:
// Hono core and its CORS middleware
import { Hono } from "hono";
import { cors } from "hono/cors";
// (Project-local modules like RiddleFetcher, RiddleManager_V2, notifyOnTelegram,
// and allowedOrigins are imported from elsewhere in the repo.)

// Initialize Hono app
const app = new Hono<{ Bindings: Env }>();

// Middleware
app.use(
  "*",
  cors({
    origin: (origin) => {
      if (origin && allowedOrigins.includes(origin)) {
        return origin;
      }
      return "";
    },
    allowMethods: ["GET", "POST", "DELETE", "OPTIONS"],
  })
);

// Routes
app.get("/riddles", async (c) => {
  const riddles = await new RiddleFetcher(c.env).fetchDaily();
  return c.json(riddles);
});

// 404 Handler
app.notFound((c) => c.json({ error: "Not Found" }, 404));

// Error Handler
app.onError(async (err, c) => {
  await notifyOnTelegram(c.env, `Error: ${err.message}`);
  return c.json({ error: "Internal Server Error" }, 500);
});

// Export handlers
export default {
  fetch: app.fetch,
  async scheduled(event, env, ctx) {
    await new RiddleManager_V2(env).manageGeneration();
  },
} satisfies ExportedHandler<Env>;
As can be seen from the code above, the worker handles two processes:
- A single HTTP request handler that fetches the daily riddle set that players see
- A cron job that runs every two minutes for two hours, once every two months.
All errors are logged and a Telegram notification is sent so I’m directly notified on my phone!
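The post doesn't show the Telegram hook itself, so here is a minimal sketch of such a notifier using the Telegram Bot API's sendMessage endpoint. The secret names (`TELEGRAM_BOT_TOKEN`, `TELEGRAM_CHAT_ID`) are assumptions, not the app's actual bindings:

```typescript
// Hypothetical Worker secret bindings; the real app's names may differ.
interface TelegramEnv {
  TELEGRAM_BOT_TOKEN: string;
  TELEGRAM_CHAT_ID: string;
}

// The Bot API URL is built separately so it is easy to test in isolation.
export function buildTelegramUrl(token: string): string {
  return `https://api.telegram.org/bot${token}/sendMessage`;
}

// Fire a message at the configured chat; errors here are deliberately
// swallowed so a failed notification never crashes the Worker itself.
export async function notifyOnTelegram(env: TelegramEnv, text: string): Promise<void> {
  try {
    await fetch(buildTelegramUrl(env.TELEGRAM_BOT_TOKEN), {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ chat_id: env.TELEGRAM_CHAT_ID, text }),
    });
  } catch {
    // Nothing sensible to do if Telegram itself is down.
  }
}
```

In a Worker, the two secrets would be set with `wrangler secret put` so they never land in the repo.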
Cloudflare Workers KV - Sweet Like Kiwi
The database of choice was Cloudflare Workers KV, CF's NoSQL key-value store, which was fairly straightforward to work with. For folks who need their data to be ACID-compliant, look towards Durable Objects or D1; for the rest of us, KV gets the job done. Note, however, that it is very much eventually consistent: I saw a 3-5 second delay for saved data to propagate throughout the network, meaning that if a user were to save data and immediately refresh the page, they'd still see stale data, which may cause all sorts of unhappiness. You'll want to employ a client-side optimistic update strategy.
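An optimistic update just means mutating the local copy first and reconciling with the server afterwards. A minimal sketch of the idea, with illustrative names rather than the app's actual code:

```typescript
// Illustrative optimistic-update helper: update local state immediately so
// the UI never waits on KV, then persist in the background and roll back
// if the write fails.
type PersistFn = (words: string[]) => Promise<void>;

export async function saveWordOptimistically(
  local: { words: string[] },
  word: string,
  persist: PersistFn
): Promise<void> {
  const previous = [...local.words];
  local.words = [...local.words, word]; // UI reflects the save instantly
  try {
    await persist(local.words);
  } catch {
    local.words = previous; // KV write failed: undo the local change
  }
}
```

Because the client keeps its own copy, the 3-5 second KV propagation delay becomes invisible to the player.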
Misadventures with OpenAI
As a refresher, Wordazzle presents the player with seven new riddles every day, where each riddle is presented as an object with the following schema:
type Riddle = {
  word: string;
  question: string;
  example: string;
};

type RiddleSet = {
  riddles: Riddle[];
  key: string;
};
Each RiddleSet is stored in Cloudflare KV, with the key being the date in YYYY-MM-DD format. This makes retrieving the daily riddles a simple key-value lookup—ideal for a low-latency, serverless backend.
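That lookup can be sketched in a few lines. The `RIDDLES` binding name is an assumption for illustration; real Worker KV bindings are declared in wrangler.toml:

```typescript
// Format a date as the YYYY-MM-DD key used in KV.
export function dailyKey(date: Date = new Date()): string {
  return date.toISOString().slice(0, 10);
}

// Minimal shape of a KV namespace binding, narrowed to what we use here.
interface KvEnv {
  RIDDLES: { get(key: string, type: "json"): Promise<unknown> };
}

// Fetching the daily set is a single O(1) key lookup.
export async function fetchDaily(env: KvEnv): Promise<unknown> {
  return env.RIDDLES.get(dailyKey(), "json");
}
```

One caveat with date-based keys: `toISOString()` is UTC, so "today" rolls over at midnight UTC for every player regardless of their timezone.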
AI Prompt Engineering: Or Learning Things The Hard Way
One of the biggest lessons I learned while generating riddles programmatically was this:
If you want the model to output complex structured data and aren’t getting the desired results, break the process into multiple stages (i.e., multiple prompts).
This approach provides multiple benefits:
- Lower Token Consumption – Each prompt operates on a smaller context, reducing input tokens and improving model efficiency.
- Focused Prompts – Instead of forcing a single prompt to generate everything (word, question, and example), each step focuses on one specific subtask, improving accuracy and consistency.
Initially, I had one monolithic prompt that attempted to generate an entire set of riddles in a single request. Here's a very small snippet from that prompt:
- "question": A question crafted based on the word's dictionary definition, starting with "What [LETTER]...".
- "answer": The word that matches the question, all uppercase.
- "example": An example sentence that uses the answer in a sentence.
I learnt this the hard way, but this was far too much cognitive load for poor old GPT, which struggled to balance coherence with creativity across generations.
To solve for this, I implemented a state machine to orchestrate the riddle generation process in sequential steps. The manager class coordinates the workflow; this way each component (word selection, question generation, example crafting) can run independently and output its result to the manager, which then handles it accordingly.
At a high level, the mgrStep object tracks the current state of the riddle generation pipeline. The process consists of four distinct stages:
async manageGeneration() {
  try {
    const isBlocked = await this.isBlocked();
    if (isBlocked) {
      return;
    }

    const mgrStep = await this.getMgrStep();
    if (mgrStep.step === "COMPLETE") {
      if (this.shouldGenerateNewSet(mgrStep.latestRiddleSetKey)) {
        await this.updateMgrStep(mgrStep);
      }
    } else if (mgrStep.step === "FETCH_WORDS") {
      await this.manageWordGeneration(mgrStep);
    } else if (mgrStep.step === "FETCH_QUESTIONS") {
      await this.manageQuestionGeneration(mgrStep);
    } else if (mgrStep.step === "FETCH_EXAMPLES") {
      await this.manageExampleGeneration(mgrStep);
    } else if (mgrStep.step === "BUILD_SET") {
      await this.buildRiddleSets(mgrStep);
    } else {
      throw new Error(`Unknown step: ${mgrStep.step}`);
    }
  } catch (err: unknown) {
    await new ErrorHandler(this.env).handleError(err);
  }
}
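Because the current step is persisted between cron invocations, each run picks up where the last one left off. The transition order can be sketched as a tiny pure function; the step names mirror those in the code above, but the helper itself is illustrative:

```typescript
// The five states of the riddle-generation pipeline.
type Step = "FETCH_WORDS" | "FETCH_QUESTIONS" | "FETCH_EXAMPLES" | "BUILD_SET" | "COMPLETE";

// Advance the pipeline one stage; COMPLETE is terminal until a new set is due.
export function nextStep(step: Step): Step {
  switch (step) {
    case "FETCH_WORDS": return "FETCH_QUESTIONS";
    case "FETCH_QUESTIONS": return "FETCH_EXAMPLES";
    case "FETCH_EXAMPLES": return "BUILD_SET";
    case "BUILD_SET": return "COMPLETE";
    case "COMPLETE": return "COMPLETE";
  }
}
```

Modeling the transitions as data like this also makes it trivial to unit-test the pipeline without touching the OpenAI API or KV.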
My AI Pipeline For Riddle Generation
- Word Selection – The system first generates a word bank of 300 words per letter (fetching 50 words at a time). This stage required extensive prompt tweaking and temperature tuning, as word diversity is critical for game replayability.
- Question Generation – Once the word set is built, we generate dictionary-inspired riddles for each word. This step ensures that questions are phrased consistently, following the "What [LETTER]…" format.
- Example Sentence Generation – For each word, the model crafts a contextual example to reinforce meaning. This was another area where prompt engineering played a crucial role—forcing the model to avoid generic sentences and create meaningful usage examples.
- Set Assembly – The generated words, questions, and examples are compiled into a structured RiddleSet, and then stored in a KV store.
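To make the per-stage split concrete, here is a sketch of what focused prompt builders for the first three stages might look like. The wording is illustrative, not the app's actual prompts:

```typescript
// Stage 1: ask only for words — no questions, no examples.
export function wordPrompt(letter: string, count: number): string {
  return `List ${count} sophisticated English words starting with "${letter}", one per line.`;
}

// Stage 2: one word in, one definition-based riddle out.
export function questionPrompt(word: string): string {
  const letter = word[0].toUpperCase();
  return `Write a riddle for "${word}" based on its dictionary definition, phrased as "What ${letter}...".`;
}

// Stage 3: a single contextual example sentence per word.
export function examplePrompt(word: string): string {
  return `Write one vivid, non-generic sentence that uses "${word}" in context.`;
}
```

Each prompt carries only the context its subtask needs, which is exactly where the token savings and accuracy gains described above come from.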
That’s a deep dive into the architecture and lessons learned while building Wordazzle. The journey involved juggling frontend technologies, serverless backend innovations, and advanced AI prompt engineering—all orchestrated by a robust state machine. I hope these insights help fellow developers navigate their own side projects with confidence and creativity.