At some point this week, I watched the navigation bar on my live website slide to the far left of the screen and disappear halfway off the page. The site had been live for clients to see. I had no developer to call.

I opened Claude and typed: the nav is broken.

Twenty minutes later, it was fixed, committed to GitHub, and deployed. That’s the version of this story that makes a good LinkedIn post. But it’s not the whole story, and the whole story is more useful.

What I actually built

This website — the one you’re reading — is a production site. It runs on Astro, a modern static site generator. It deploys automatically to Netlify every time a change is pushed to GitHub. It has a working CMS for editing content without touching code. The contact form sends submissions to Klaviyo and emails me a notification. There’s a Cal.com booking embed, structured data for search, a mobile menu with a focus trap for accessibility, and a newsletter signup that feeds a separate Klaviyo list.

None of that is a template. It was built from scratch.

I did not write the code. I do not have a development background. What I have is a clear sense of what I want, the ability to ask precise questions, and a high tolerance for iteration when things don’t work the first time.

The technology did the building. I did the thinking. That division of labor is harder to maintain than it sounds.

What AI collaboration actually looks like

The version of AI-assisted work that gets talked about in the abstract — “just describe what you want and it builds it” — is real, and it’s also incomplete. What it leaves out is the part where you have to know what you want precisely enough for it to matter.

When I started this project, I knew I wanted a dark, editorial aesthetic. I knew I wanted case studies, industry pages, a blog, and a contact form with Calendly integration. I knew I didn’t want a WordPress site or a Squarespace template. That specificity was the input that made the output useful.

What I couldn’t have told you at the start was how a form submission gets from a browser to a Klaviyo profile. Or why a navigation menu that works on desktop might need to live outside the <nav> element on mobile because of something called a backdrop-filter stacking context. Or that when your form JavaScript refers to this.name, it’s reading the form’s own name attribute — “contact” — instead of the input field named “name”, and that this difference would silently cause every Klaviyo submission to fail while still redirecting users to a success page.

That last one cost us an afternoon.
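For anyone curious what that bug actually looks like, here is a minimal simulation in plain JavaScript. The markup it mimics and the values are illustrative, not my site's real code: a form's built-in attributes shadow the named lookup of its input fields.

```javascript
// Simulated in plain JS (no browser needed): an HTML form exposes its
// controls as properties, but built-in attributes like `name` win the
// lookup. Real markup: <form name="contact"><input name="name"></form>
const form = {
  name: "contact", // the form's own name attribute shadows the field
  elements: {      // the actual input controls
    name: { value: "Jane" },
    email: { value: "jane@example.com" },
  },
};

// BUG: reads the form's attribute, not the field named "name".
const buggy = form.name; // "contact"

// FIX: go through form.elements to reach the control itself.
const fixed = form.elements["name"].value; // "Jane"

console.log(buggy, fixed); // contact Jane
```

In a real browser, `form.elements["name"]` (or a `FormData` object) is the reliable way to reach a field, precisely because it can't collide with the form element's own properties.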

The point isn’t that AI got it wrong — the point is that building anything real means encountering edge cases, and edge cases require diagnosis. Claude walked me through every one of them. But I had to stay in the room, ask follow-up questions, and recognize when “it should be working now” needed to become “let’s add logging and look at what’s actually happening.”

When it broke, I had to think

The Klaviyo integration is a good example. The contact form was submitting. Netlify was receiving it and sending me email notifications. But submissions weren’t showing up in Klaviyo. The form was going to a success page. There was no visible error anywhere.

We worked through it in layers. First we confirmed the serverless function was being invoked at all. Then we found the environment variable had the wrong name because I had followed an earlier instruction that turned out to be incorrect. Then we found the API key value itself hadn’t been saved when I thought I’d saved it. Then — after all of that was resolved — we found the silent field-reading bug that had been there since the beginning.
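Those layers can be sketched as a small check function. The names here, like KLAVIYO_API_KEY and the exact checks, are illustrative assumptions, not the site's actual code; the point is that each layer is verified explicitly instead of assumed.

```javascript
// A sketch of the layered checks (illustrative names, not the site's
// real code): verify each assumption before blaming the next layer.
function diagnoseSubmission(fields, env) {
  // Layer 1: were we invoked at all, and with what?
  console.log("invoked with fields:", Object.keys(fields));

  // Layer 2: is the env var set under the exact name we expect?
  const apiKey = env.KLAVIYO_API_KEY; // a typo here fails silently
  console.log("KLAVIYO_API_KEY present:", Boolean(apiKey));
  if (!apiKey) return { ok: false, reason: "missing API key" };

  // Layer 3: did the form actually send the fields the CRM needs?
  if (!fields.email) return { ok: false, reason: "missing email" };

  return { ok: true };
}

// An unset or misspelled variable surfaces immediately:
console.log(diagnoseSubmission({ email: "jane@example.com" }, {}));
// → { ok: false, reason: "missing API key" }
```

The design choice that matters: fail loudly at each layer instead of redirecting to a success page regardless of outcome.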

None of those problems were complicated once we found them. All of them required paying attention.

That’s what I’d tell anyone who is considering using AI to build or maintain their marketing infrastructure: it is genuinely capable of building things that work. But “things that work” requires you to care whether they actually work, not just whether they appear to. The gap between a form that redirects to a success page and a form that actually delivers data to your CRM is invisible if you’re not checking.

The part that surprised me

I expected the code to be the hard part. It wasn’t.

The hard part was learning enough to ask good questions. Not technical questions — I still can’t write a line of JavaScript from scratch — but specific ones. What does a 409 response mean? Why would an environment variable be set but return false? What’s the difference between a variable name and a variable value?
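That last question has a concrete answer in JavaScript: a variable can exist and still be falsy, for example when it was saved as an empty string. A toy illustration (values made up):

```javascript
// "Set but false": the variable name exists, but the value was never
// actually saved, so it's an empty string — which JavaScript treats
// as false. Illustrative values only.
const env = { KLAVIYO_API_KEY: "" };

console.log("KLAVIYO_API_KEY" in env);     // true  — the name is set
console.log(Boolean(env.KLAVIYO_API_KEY)); // false — the value is empty
```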

That kind of fluency — not expertise, just fluency — is available to anyone willing to engage with the work instead of delegating it entirely. And it compounds. The second problem is easier to diagnose than the first because you’ve developed a mental model of how the system fits together.

You don’t need to know how to build it. You need to know how to think about it. Those are different skills, and the second one is learnable faster than the first.

What this means if you’re a founder watching from the sidelines

I work with clean energy companies whose founders are doing the same calculation I was doing six months ago: real website versus quick template, proper CRM integration versus a spreadsheet, scalable marketing infrastructure versus something held together with Mailchimp and hope.

The calculation used to be: do I hire a developer, or do I wait until we can afford one? That’s no longer the right question.

The right question is: am I willing to stay engaged enough to make AI collaboration actually work? Because that’s the constraint now — not access to capability, but willingness to direct it.

A template gets you 80% of the way to a website that looks fine. What it doesn’t get you is the infrastructure that makes your marketing machine run: the list segmentation, the form-to-CRM flow, the content structure that lets SEO compound, the analytics hooks that tell you which content is doing work. Those things require decisions, and decisions require someone who understands the goal well enough to make them.

That person is you. Claude is very good at the part that comes after.

The honest version

There were moments in this process that were frustrating. Moments where I thought something was fixed and it wasn’t. Moments where I followed an instruction, confirmed I had followed it, and then had to confirm again because a step had been missed somewhere. Moments where the problem turned out to be embarrassingly simple — a forgot-to-hit-save that cost an hour of debugging.

That’s not a failure mode of AI-assisted work. That’s what building anything looks like.

What made it work was treating the process the same way I’d treat any complex project: break it into pieces, verify each piece actually works before moving to the next, and don’t mistake a success page for a successful outcome.

The navigation bar is straight. The Klaviyo integration is running. The site is live.

And I didn’t have to wait for a developer.

