Vibe Coding with AI: A Cautionary Tale from a Software Engineer

I’ve been a professional software engineer for years, but like many of us, I’ve been eyeing the growing capabilities of AI coding assistants with a mix of curiosity and skepticism. Recently, I decided to take the plunge and experiment with “vibe coding” — that is, offloading much of the heavy lifting to an AI model while I guide it along with well-crafted prompts. The idea was to build something that would normally take me a few weeks, using tools I wasn’t fully fluent in, and let the AI fill in the gaps.

What followed was both enlightening and infuriating. Here are a few hard-won lessons from my time vibe coding:

1. It Doesn’t Listen

I asked for an Angular app. It gave me React. Then Next.js. Then back to React, this time with some Tailwind sprinkled in for no reason. I revised the prompt, clarified the request, even pasted in Angular docs as context — it didn’t matter. The AI had a vibe of its own. Only after many more rounds of correction did I finally get what I wanted.

2. It Doesn’t Learn from Its Mistakes

The most frustrating part wasn’t the initial mistake — it was the repetition. I’d point out what was wrong. It would apologize, fix something tangential, and then repeat the exact same broken pattern in the next generation. When humans get feedback, we usually course-correct. The AI? It spins its wheels in the same ditch.

3. Don’t Ask Too Much

Asking an AI to build multiple components, handle routing, and wire up services in one go? Good luck. It starts strong, stumbles halfway through, and by the end, it’s hallucinating interfaces and inventing properties you never mentioned. If you ask for too much at once, it will give you everything except what you need.

4. Claude 4 > GPT-3.5 (By a Lot)

If you’re using 3.5, you’re wasting your time. Claude 4, in my experience, showed noticeably better reasoning, followed instructions more reliably, and generally gave me fewer gray hairs. It still struggled, but at least it wasn’t gaslighting me with bad code while claiming everything compiles.

5. Coding Is Being Replaced — By Prompting

It’s wild to realize how much my work during this experiment shifted from coding to coaching. Writing clear, structured prompts that break the problem into digestible steps became the most valuable skill. I wasn’t writing code so much as debugging the AI’s brain.

6. My Job Is Safe (For Now)

Yes, AI can churn out boilerplate and scaffold things quickly. But ask it to design a resilient architecture, make thoughtful tradeoffs, or anticipate edge cases? It can’t. Not yet. Vibe coding gets you a prototype, not a product.

7. Management Will Love It… Until It Breaks

On the surface, AI looks like a productivity miracle. Devs are faster! Code appears like magic! But when that code starts failing in weird, intermittent ways and no one understands what the AI was trying to do — that’s when the real cost emerges. Debugging generated spaghetti is harder than writing it yourself.

8. Unsolicited “Help” Is a Double-Edged Sword

Sometimes the AI does things you never asked it to — auto-generating types, refactoring code, renaming variables, etc. Occasionally it’s useful. More often, it breaks things that were already working. Like a junior dev who tries to be clever without understanding the system, but who can use Google and Stack Overflow really, really, REALLY fast.


Final Thoughts

Vibe coding isn’t useless. It’s fast, it’s occasionally brilliant, and it’s undeniably a glimpse into the future of software development. But for now, it’s like hiring a very enthusiastic intern with no memory, a short attention span, and an overinflated sense of confidence.

We’re not out of a job. But we are learning a new one — one where the real skill is not in what you code, but in how well you can guide an AI through the fog.

Oh — and in the spirit of full transparency: I asked ChatGPT to help write this post based on my bullet-point rant. It mostly listened this time.

Cost-Effective Solution for Archiving Final Cut Pro Libraries

The Problem

For my YouTube channel, I have been using Final Cut Pro on my Mac to assemble and edit all of my videos. I generally create one library for each video, though sometimes I will put multiple videos in one library, as long as they are on the same subject. It mostly depends on when I think I will be recording and editing them; I have no hard-and-fast rules.

When I am working on these videos, I keep the Final Cut library on a faster solid-state drive. When I am finished, I move it to one of two 8TB external hard disk drives as an archive/backup. But those drives are starting to fill up.

What I need is a cheap place to archive these old projects so I never lose them. If I ever do need one again, I probably won’t need it right away. I could just buy another hard drive, but my worry is that hard drives can and do fail, even while sitting on a shelf. I’d still need some other kind of backup for my backup.

Commercial Online Cloud Backup

I started looking at commercially available cloud-based solutions. The ones I looked at have some pretty steep prices, but they seem to be targeted at users who need ready access to the files in the archive, or even unattended backup/archiving. That’s not what I am looking for.

Amazon Web Services

I am an AWS Solutions Architect, so I started looking at the cost of storing these files in Amazon’s Simple Storage Service, or S3. S3 offers several different storage classes, and some of them can be costly.

For example, if I were to upload all 16TB of archived projects I have right now, it would cost well over $350 a month for the standard, ready-access storage class. But I don’t need that. I’m only interested in moving up the files that I haven’t looked at in a year or more. I probably will never need to look at them again, but I’d like to be able to, even if not right away.

The least expensive S3 storage class is called Glacier Deep Archive. That same 16TB of storage would only be a little more than $15 a month, which is perfect for me. Even storing 100TB is only around $100 a month, a small fraction of the price of the standard S3 class and about 1/6th the price of the more popular online backup solutions. I’m not saying those other options aren’t worth it; they are, for what they provide, which is a lot more than what I need.
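As a sanity check on those figures, here is a quick back-of-the-envelope calculation. It assumes the published us-east-1 rates of roughly $0.023 per GB-month for S3 Standard and $0.00099 per GB-month for Glacier Deep Archive; rates vary by region and change over time, so treat these as ballpark numbers.

```shell
# Approximate monthly storage cost for 16 TB (about 16,000 GB), ignoring
# request, retrieval, and data-transfer charges.
awk 'BEGIN {
  gb = 16000
  printf "S3 Standard:          $%.2f/month\n", gb * 0.023    # ready access
  printf "Glacier Deep Archive: $%.2f/month\n", gb * 0.00099  # cold archive
}'
```

That works out to about $368 versus about $16 a month, which lines up with the figures above.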

My Solution

So I set off, with the help of AI, to write a script that scans a directory and uploads my Final Cut libraries to Amazon S3 Glacier Deep Archive. I wrote it in Bash to make it portable to other operating systems and other uses. So far, I have uploaded well over a terabyte of data to S3, and I might be paying $1.00 for the month. It is going to take a while to upload all of these files, but the script just sits and chugs along in a terminal window.

The script scans the directory structure, locates any Final Cut libraries, creates multi-part zip files using 7-Zip, then uses the AWS S3 API to upload the individual parts. Once all parts have been uploaded, the archive is complete. I still manually verify that the file made it before I delete the local version. If the script fails at any point during the upload, I can run it again and it will skip any parts that have already been uploaded.
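The workflow above can be sketched in Bash. To be clear, this is a simplified, hypothetical reconstruction, not the script from my repository: the bucket name is made up, and it defaults to a dry run that only prints the 7-Zip and AWS CLI commands it would execute.

```shell
#!/usr/bin/env bash
# Sketch of the scan -> split -> upload workflow. Requires 7z and a
# configured AWS CLI for a real (non-dry) run.
set -euo pipefail

BUCKET="s3://my-fcp-archive"   # hypothetical bucket name
PART_SIZE="1g"                 # 7-Zip volume size for multi-part archives
DRY_RUN="${DRY_RUN:-1}"        # default: print commands instead of running them

# Run a command, or just print it when DRY_RUN=1.
run() {
  if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi
}

archive_library() {
  local lib="$1" name part
  name="$(basename "$lib" .fcpbundle)"

  # Split the library package into fixed-size volumes: name.7z.001, .002, ...
  run 7z a -v"$PART_SIZE" "$name.7z" "$lib"

  # Upload each part directly into the Deep Archive storage class.
  # Parts already in the bucket are skipped, so a failed run can resume.
  if [ "$DRY_RUN" = "1" ]; then
    set -- "$name.7z.001"      # no parts exist in a dry run; show one example
  else
    set -- "$name".7z.*
  fi
  for part in "$@"; do
    if aws s3 ls "$BUCKET/$part" >/dev/null 2>&1; then
      echo "skipping $part (already uploaded)"
    else
      run aws s3 cp "$part" "$BUCKET/$part" --storage-class DEEP_ARCHIVE
    fi
  done
}

# Scan a directory tree for Final Cut Pro libraries (.fcpbundle packages).
scan_and_archive() {
  find "$1" -type d -name '*.fcpbundle' -print0 |
    while IFS= read -r -d '' lib; do archive_library "$lib"; done
}

# Dry-run demo against a placeholder library.
demo="$(mktemp -d)"
mkdir "$demo/MyVideo.fcpbundle"
scan_and_archive "$demo"
```

Running it with `DRY_RUN=0` would perform the real archive and upload; the existence check with `aws s3 ls` before each `aws s3 cp` is what makes a failed run resumable.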

I’ve uploaded the script to a public GitHub repository, licensed under the Creative Commons NonCommercial license, if you want to check it out for yourself. I enjoyed writing it, and it’s been nice getting some of that hard drive space back.

Assembling the Bridgeport Mill Quill Housing


Time to start putting things back together. The parts for the quill housing are all clean and ready to be assembled.

I mention H&W Machine Repair and Rebuilding quite a bit throughout this particular set of videos. I could not have had the confidence to take on this project if it weren’t for the videos that Barry has provided to the rest of us here on YouTube. I cannot thank them enough.