
Career
I've been using AI tools almost every day now: Claude, Midjourney, Cursor, ElevenLabs, Pomelo, Stitch, OpenClaw, video generation platforms, you name it. Somewhere between prompting and generating, a strange feeling crept in. Are we actually creating anything anymore? Or are we just... typing thoughts into a box and watching machines do the rest? I sat with that discomfort for a while, and then I realised: this feeling isn't new. Not even close.
The pattern you're sensing is real - and it's happened before. Every major technological wave has triggered this exact existential discomfort:
Agricultural revolution → "Machines now plow. What am I?"
Industrial revolution → "Factories now produce. What am I?"
Computing/90s → "Computers now calculate. What am I?"
Internet → "Machines now distribute information. What am I?"
AI now → "Tools now create. What am I?"
Each time, the question felt terminal. Each time, humans adapted - but not by going back. By moving up the value stack.

What actually changed each time? The pattern is consistent:
Machines took over the physical/mechanical layer (factories replaced manual labor)
Computers took over the computational layer (spreadsheets replaced accountants doing arithmetic)
AI is now taking over the cognitive-execution layer (writing, coding, designing, analysing)
What's left? What humans keep moving to: Judgment. Taste. Context. Stakes. Accountability. Meaning. You're not "just giving input." You are the one who decides what matters, what's good enough, what direction to go, and what the output is actually for. That's not trivial - that's the hardest part.

Why does it feel like "doing nothing"? Because we've historically confused effort with value.
A lawyer who takes 3 hours to draft a contract feels more valuable than one who uses AI to do it in 20 minutes - but the client doesn't care about the 3 hours. They care about the outcome.
A designer who spends weeks in Photoshop feels more creative than one using AI tools - but the question is: whose vision was better?
AI is decoupling effort from output. That's disorienting, but it's not the same as being useless.

There is a real risk, though - and it's legitimate. Here's where I'll be direct: your concern isn't unfounded at a macro level. If the only value you add is prompting - giving generic inputs to get generic outputs - then yes, that's a thin position. Anyone can do that. The people who will matter in an AI-saturated world are those who bring:
Domain depth - knowing when the AI is wrong, shallow, or missing context
Taste - knowing what "great" actually looks like, not just "acceptable"
Strategic framing - asking the right question, not just any question
Accountability - being the human who owns the outcome
The risk isn't that AI replaces you. It's that undifferentiated humans get replaced - people who can't add judgment above what the AI produces. In the next few years, AI will handle most of the doing: writing, coding, analysing. Humans will own the thinking, the direction, the judgment, and the question of what's worth building. The biggest gap won't be between humans and AI. It'll be between humans who know how to think clearly and those who don't.
Bottom line for you: today you're not doing nothing, but the game has changed:
Old value: how well can you execute?
New value: how well can you think, judge, and direct?