I try, then again I don't try.


What am I even doing?

I was driving to work this morning and decided to shuffle all the songs on my phone rather than listening to the usual dozen. The very first song was one I hadn’t listened to in several years: “F For Effort” by Les Vinyl (they’re pretty good; you should check them out). When the song ended I played it again, and again, and a third time, really listening to the lyrics. The opening verse kept hitting me like a truck:

I heard that Rome wasn’t built in a day
Then how come everyone is rushin’ to get ahead?

It’s not incredibly profound or anything, but it perfectly sums up my recent feelings on just about everything in life. Everything has a time limit, everything must be done right now, everything must be faster, everything must be more efficient. There’s a particular kind of fatigue that doesn’t necessarily come from working too hard, but from watching the tasks you have worked hard to master become trivial and low effort. I’m a software engineer in the age of generative AI and, to be honest, I feel like I’m drowning.

Until recently, the use of generative AI for anything but the most menial tasks like text formatting or boilerplate was seen as “cheating”. If you asked ChatGPT to write your code, the result was expected to be clunky and incoherent. I admit I enjoy playing around with generative AI as much as the next person, but I never thought it would be much more than a toy. That curiosity has become uncertainty and occasionally outright fear. The fear of becoming obsolete, or of how I’ll be able to provide for myself or my family when the skills I’ve been developing for more than a decade suddenly become reproducible with the press of a button.

“Just tell Claude to do it.”

or

“This should only take Cursor a couple of minutes.”

And they’re not wrong. It only takes minutes (sometimes seconds) for the robot-of-the-month to produce something that would have taken me much longer - sometimes hours or even days. Not just boilerplate but something coherent, compilable, runnable, even impressively “good”. Good enough that the room moves on anyway. “Good enough” that nobody really gives a shit how it works, only that it exists. That’s the part that quietly drains my motivation.

I’m not saying I’m the best developer in the world, I’m FAR from it, but I have spent so much time and effort gaining the skills I currently have, and I have been very fortunate that those skills allow me to afford a roof over my head and food on my table - or the frivolous roleplaying books and miniature robots that litter my dining room table. Skills that can now be reproduced with a prompt.

For my entire life (and likely yours) learning meant leverage. You struggled through documentation, you broke things, you fixed them, and you gained valuable experience. You understood that time invested would compound and your skills would improve, making each new problem or task that much easier and each solution that much more elegant. It is a reward loop you can experience again and again, and it feels good. AI broke that loop. Why spend evenings learning a new language or framework when the answer is going to be “ask the robot”? Why refine instincts around performance, maintainability, and accessibility when the output is only measured in how fast it ships rather than the quality of the code underneath? What are you supposed to do when the scope shifts from “How do we build this well?” to “How quickly can we churn this out?”

What’s left is a strange sense of redundancy. Not just of labor but of curiosity and creativity.

The most demoralizing moments aren’t when AI does something better than I could. They’re when it does something good enough that nobody cares how it was made. When craftsmanship becomes invisible, and learning feels indulgent rather than essential. When the quiet pride of knowing why something works is replaced with the loud efficiency of copy, paste, ship. It’s incredibly hard not to internalize that feeling.

You start wondering if improving your own skills is a bad investment. If depth has become optional. If the future belongs to those who prompt fastest, not those who understand most. There’s a creeping voice that says: stop trying so hard. Stop caring. The machine has you covered. That voice is a liar (for now, anyway).

I truly believe there’s a difference you can feel between software that merely functions and software that respects its users. Between systems that technically work and systems that make sense. Between code that exists and code that communicates. AI is extraordinary at producing output. It is less good at producing intent.

AI doesn’t know who will maintain this in a year. It doesn’t feel the frustration of a confusing interface or the relief of a clear one. It doesn’t carry responsibility when a small decision cascades into a large failure. Humans do; that’s where motivation, as fragile as it feels right now, comes from. Maybe the work that still matters is slower, quieter, and harder to measure: choosing clarity over cleverness, empathy over optimization, understanding over novelty.

Writing software for humans. Software that leaves a trace of care for its users; care that no model, no matter how capable, actually feels.

Learning still matters, and since I don’t have much of a choice, my 2026 resolution is to let AI take the grunt work, the scaffolding, the repetition, and to focus my energy where it can’t replace me yet: in empathy, in responsibility, in the small decisions that shape how something feels to use.

  • Travis