Vibe Coding with AI: A Cautionary Tale from a Software Engineer

I’ve been a professional software engineer for years, but like many of us, I’ve been eyeing the growing capabilities of AI coding assistants with a mix of curiosity and skepticism. Recently, I decided to take the plunge and experiment with “vibe coding” — that is, offloading much of the heavy lifting to an AI model while I try to guide it along with well-crafted prompts. The idea was to build something that would normally take me a few weeks, using some tools I wasn’t fully fluent in, and letting the AI fill in the gaps.

What followed was both enlightening and infuriating. Here are a few hard-won lessons from my time vibe coding:

1. It Doesn’t Listen

I asked for an Angular app. It gave me React. Then Next.js. Then back to React, this time with some Tailwind sprinkled in for no reason. I revised the prompt, clarified the request, even pasted in Angular docs as context — it didn’t matter. The AI had a vibe of its own. Only after many rounds of insisting did I finally get what I wanted.

2. It Doesn’t Learn from Its Mistakes

The most frustrating part wasn’t the initial mistake — it was the repetition. I’d point out what was wrong. It would apologize, fix something tangential, and then repeat the exact same broken pattern in the next generation. When humans get feedback, we usually course-correct. The AI? It spins its wheels in the same ditch.

3. Don’t Ask Too Much

Asking an AI to build multiple components, handle routing, and wire up services in one go? Good luck. It starts strong, stumbles halfway through, and by the end, it’s hallucinating interfaces and inventing properties you never mentioned. If you ask for too much at once, it will give you everything except what you need.
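In the end, the only approach that worked for me was to ask for one small, well-scoped piece at a time and review it before moving on. To make that concrete, here is a rough sketch in TypeScript of the size of request that tended to survive a single prompt. Everything in it is hypothetical: the Task shape, the service name, and the /api/tasks endpoint are placeholders for illustration, not pieces of my actual project.

    // Hypothetical example: the kind of single, narrowly scoped unit
    // worth requesting in one prompt, instead of "build the whole feature".
    import { Injectable } from '@angular/core';
    import { HttpClient } from '@angular/common/http';
    import { Observable } from 'rxjs';

    // One deliberately small data shape: easy to review, hard to hallucinate around.
    export interface Task {
      id: number;
      title: string;
      done: boolean;
    }

    @Injectable({ providedIn: 'root' })
    export class TaskService {
      // Placeholder endpoint; wiring it to a real API is a later, separate step.
      private readonly apiUrl = '/api/tasks';

      constructor(private http: HttpClient) {}

      // Fetch all tasks. Components, routing, and error handling each get
      // their own follow-up prompt once this piece compiles and makes sense.
      getTasks(): Observable<Task[]> {
        return this.http.get<Task[]>(this.apiUrl);
      }
    }

Once something that small exists and compiles, asking for the component that consumes it is a far safer second prompt than asking for both at once.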

4. Claude 4 > GPT-3.5 (By a Lot)

If you’re using 3.5, you’re wasting your time. Claude 4, in my experience, showed noticeably better reasoning, followed instructions more reliably, and generally gave me fewer gray hairs. It still struggled, but at least it wasn’t gaslighting me with bad code while claiming everything compiles.

5. Coding Is Being Replaced — By Prompting

It’s wild to realize how much my work during this experiment shifted from coding to coaching. Writing clear, structured prompts that break the problem into digestible steps became the most valuable skill. I wasn’t writing code so much as debugging the AI’s brain.

6. My Job Is Safe (For Now)

Yes, AI can churn out boilerplate and scaffold things quickly. But ask it to design a resilient architecture, make thoughtful tradeoffs, or anticipate edge cases? It can’t. Not yet. Vibe coding gets you a prototype, not a product.

7. Management Will Love It… Until It Breaks

On the surface, AI looks like a productivity miracle. Devs are faster! Code appears like magic! But when that code starts failing in weird, intermittent ways and no one understands what the AI was trying to do — that’s when the real cost emerges. Debugging generated spaghetti is harder than writing it yourself.

8. Unsolicited “Help” Is a Double-Edged Sword

Sometimes the AI does things you never asked it to — auto-generating types, refactoring code, renaming variables, etc. Occasionally it’s useful. More often, it breaks things that were already working. Like a junior dev who tries to be clever without understanding the system, but who knows how to use Google and Stack Overflow really, really, REALLY fast.


Final Thoughts

Vibe coding isn’t useless. It’s fast, it’s occasionally brilliant, and it’s undeniably a glimpse into the future of software development. But for now, it’s like hiring a very enthusiastic intern with no memory, a short attention span, and an overinflated sense of confidence.

We’re not out of a job. But we are learning a new one — one where the real skill is not in what you code, but in how well you can guide an AI through the fog.

Oh — and in the spirit of full transparency: I asked ChatGPT to help write this post based on my bullet-point rant. It mostly listened this time.

Feel free to leave a comment or question.
