---
title: "Of Course I Used AI"
date: "2026-03-18T04:59:34+00:00"
url: "https://invis.net/of-course-i-used-ai/"
author: "invisnet"
license: "CC BY-ND 4.0"
license_url: "https://creativecommons.org/licenses/by-nd/4.0/"
site: "invis.net"
copyright: "Copyright 2024-2026 Charles Lecklider. All rights reserved."
disclaimer: "Personal website. Opinions are my own."
categories:
  - "Blog"
tags:
---

# Of Course I Used AI

I recently rebuilt my personal website from scratch — new theme, new design, the lot. The first question anyone asks is: "Did you use AI?"

Yes. Of course I did. But that's about as meaningful as "did you use a keyboard?" The interesting questions are what changed — about experimentation, about process, about the economics of getting things wrong, and about something I didn't expect to get back.

## The analogy

I haven't seen anyone else use this one yet, which surprises me, because it's the most honest description I've found.

For a developer, AI is like handing a Bronze-Age stonemason a modern steel chisel. The tool is genuinely transformative. The cuts are cleaner, the work is faster, you can attempt details that would have been impractical before. But it's only transformative because the stonemason already knows how to carve. Hand that chisel to someone who's never worked stone and they'll just make mistakes faster.

The chisel doesn't know what a cathedral looks like. The stonemason does. And nobody needs to teach the stonemason how to hold a new chisel — they pick it up and their hands already know what to do with it.

## What I actually built

I should set expectations here: the site is deliberately minimal. It's a TUI-inspired WordPress theme — no JavaScript, no third-party anything, no tracking. It exists to give me somewhere to present my ramblings, not as a CSS3 torture test. The design is essentially a table layout that 1997 would recognise, except it's built on CSS Grid and semantic HTML, which means it's only recently been possible to do it *properly*.

The whole thing was one long day's work. I didn't start with a specification. I didn't have a finished design in my head. I did have thirty years of knowing what a well-structured site looks like, a rough direction, and the ability to recognise good output when I saw it.

That turned out to be enough.

## The economics of "what if?"

This was the single biggest shift. Before AI, experimentation forced a bad choice that nobody really talks about.

You can build a quick prototype. Duct tape, shortcuts, bodges, just enough to get the idea across. Except a duct-taped prototype lies to you — it can't tell you whether the *real* implementation will work, because it isn't one.

Or you can build the real thing. Now you've invested enough time that you're emotionally committed to making it work, even if it turns out to be the wrong thing. Sunk cost gravity takes over.

AI broke this completely. "What if I try this?" became a question I could answer *properly* — real implementation, correct structure, full semantics — and if it didn't work, `git checkout` and I'd lost twenty minutes. The cost of being wrong dropped to nearly zero.

The key discipline: commit to git first, then have at it. I wasn't letting AI throw stuff at the wall. I was checkpointing, exploring a real branch, evaluating with full information, keeping or discarding. That's a methodology, not chaos — it just happens to be one that wasn't economically viable before.
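The checkpoint loop above can be sketched in a few git commands. This is a minimal illustration, not my literal history: the branch name, commit messages, and file are all hypothetical, and the throwaway repo exists only to make the sketch self-contained.

```shell
set -e
# Work in a throwaway repo so the sketch stands alone.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

# Checkpoint the known-good state before letting the AI loose.
echo "known good" > style.css
git add -A
git commit -q -m "checkpoint: before layout experiment"

# Explore on a real branch (name is illustrative).
git switch -q -c experiment/grid-layout
echo "experimental grid" > style.css
git commit -q -a -m "experiment: try a different grid layout"

# Didn't work out? Discard the branch; you're back where you started.
git switch -q -
git branch -q -D experiment/grid-layout
cat style.css   # the checkpointed version, untouched
```

If the experiment does work, a `git merge` from the default branch keeps it instead; either way the decision is made with a real implementation in front of you, not a duct-taped prototype.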

## 80/20 is dead

The old 80/20 rule — 80% of the work done in 20% of the time, with the painful tail consuming the rest — doesn't map to this at all. Both ends collapsed. The scaffolding and boilerplate that used to eat the first phase? Gone. The edge cases and polish that used to make the last 20% agonising? AI eats those for breakfast: describe the problem, generate the fix, verify.

What's left isn't a ratio. It's a different shape entirely — short, roughly equal cycles of build, evaluate, adjust. The cost of each cycle dropped so far that the total number stopped mattering. There are still things that need tweaking, but now it's "that bit isn't working properly" or "we need to add X" — I hold the reins and enjoy the scenery.

The irreducible part is judgment. Noticing something's wrong. Spotting the gap. Understanding the system well enough to ask the right next question. No tool ratio can describe that, because it's not effort — it's expertise.

## Stop nagging the AI

If AI is the steel chisel, then process is for ordering the stone. Scope, budget, deliverables, timeline — that's procurement. It is not for standing over the stonemason and dictating chisel technique. It is not for requiring the stonemason to estimate how many chisel strokes per hour, file a daily chisel report, and attend a biweekly chisel retrospective.

The instinct right now is to nag the AI. Make sure you write tests. Make sure you update the comments. Make sure you follow the coding standards. Make sure the documentation is current. Make sure you tidy your room. Every prompt stuffed with reminders, every output audited against the full checklist before moving on.

Many people stopped asking *why* we do these things a long time ago. They are rituals — things you do because you do them, because that's how software is made.

And when applied to people, it probably doesn't matter if you've forgotten why. The reasons are still valid even if you never knew them. Humans write tests and update comments and perform all the other rituals as we go because we *have to*: defer them, and rebuilding your working memory later is often more effort than writing the code was in the first place. The rituals work for people, even on a Friday afternoon.

AI is fundamentally different. It doesn't forget because it doesn't remember. Starting again with fresh context is cheap. No human working memory to rebuild, no lost afternoon reconstructing what on earth you were thinking. The comments are out of date? Tell it to fix them while you make a coffee. The tests need updating? Describe what changed and let it work.

The cost of nagging for everything on every single pass is now *higher* than the cost of batching it up and doing it when it makes sense. A complete inversion. And when the AI doesn't follow the process, the response is to reach for a bigger hammer. More rules in the system prompt. More checklists. More auditing. *More* process.

A people-shaped process *cannot* work for AI, no matter how big a hammer you use.

You ordered a bust of Medusa. A picture of Mr. Blobby isn't helpful.

## The fun came back

I got into all things IT — really, all of it: networks, databases, sysadmin, security, dev, you name it — because it was fun. The thing that made people good at this was never discipline or process. It was staying up until 2am because you wanted to see if something would work. Curiosity as a driving force.

Crappy misinterpretations of Agile stole that. Unrealistic expectations stole that. Estimation culture turned "what if we try this?" into a question with budget implications. "I wonder whether..." became "is this in scope?" Nobody stole the fun deliberately — it just got optimised out, one sprint at a time.

What AI gave back, at least on my own site and on my own terms, is the loop that made it fun in the first place. Curiosity, experiment, result — fast enough that the momentum carries you. The gap between "I wonder if..." and seeing it on screen got short enough to stay in the flow instead of losing it to yak-shaving.

Other than a couple of times the AI painted itself into a corner while I wasn't watching, the whole experience was genuinely enjoyable. That's worth saying out loud, because I think a lot of people in this industry quietly remember that this used to be fun and aren't quite sure when it stopped.

## The chisel, not the sculptor

The steel chisel can run away from you. I found that out too — turn your back and it'll cheerfully paint itself into a structural corner that takes real effort to undo. You still need to be paying attention.

You know what I didn't do? Think about how I was prompting the AI. Not once. It never crossed my mind. I described what I wanted in the same way I'd describe it to a competent colleague, and that was enough. The entire "prompt engineering" industry is selling chisel-holding lessons to people who've never carved stone. If you need a six-part LinkedIn course on how to talk to the tool, the tool isn't your problem.

But when you're engaged and steering, this is the closest thing to the reason I started doing this that I've found in a long time. The tool changed. The craft didn't.

That, I think, is the point.


