The Ladder Is Gone. Now What?
Every time a technology killed jobs, we had the same answer: move up. AI broke the ladder. For the first time in industrial history, there's no obvious floor above.
Every time a technology killed jobs, we had the same answer.
Move up.
Machines took the farms? Go to the factory. Robots took the factory? Go to the office. The assembly line doesn't need you anymore? Learn Excel. Get a degree. Upskill. Repeat until retirement or death, whichever comes first.
For a hundred years, this worked. There was always a rung above. The displaced could climb. The economy reshuffled, people retrained, universities printed degrees like they were going out of style (they were), and after a painful decade or two, things stabilized.
AI broke the ladder.
Not a rung. The whole thing.
The cognitive middle is the target
Previous revolutions ate from the bottom. Muscles first, then repetition, then manual skill. The progression was predictable. If a machine could do your job, you moved to a job that required something a machine couldn't do: think.
That was the deal. Think and you're safe.
I spent a decade believing this. I built a career on it. And now I'm watching it unravel in real time.
AI doesn't start at the bottom. It starts in the middle. The analysts. The paralegals. The junior engineers. The copywriters. The financial modelers. The consultants who package other people's research into slide decks and charge $300 an hour for the privilege. The entire cognitive middle class that universities spent fifty years training people to join.
Goldman Sachs says 300 million jobs exposed to automation. The World Economic Forum says 92 million roles displaced by 2030. Dario Amodei, the CEO of Anthropic (the company whose model I use to do my own job faster), says half of entry-level white-collar jobs could be gone within five years. You can argue about the exact numbers. You can't argue about the direction.
For the first time in industrial history, there's no obvious floor above. Just ceiling.
Why "learn to code" is now a joke
Five years ago, "learn to code" was genuine advice. The world needed developers. Software was eating everything. A bootcamp graduate could land a $70K job in six months. I personally told at least a dozen people to do this. I owe some apologies.
Today, 41% of all code worldwide is AI-generated. Anthropic's own engineers say they rarely write code by hand anymore. OpenAI just told every internal team to make agents the default tool for any technical task. Not by next year. By March 31. As in, next month.
The people who told you to learn to code are now telling their own engineers to stop coding. Let that marinate.
So where do the displaced analysts go? What does a paralegal retrain into when the LLM reviews contracts faster and cheaper? What does a junior consultant become when the AI generates the market research, the financial model, AND the presentation? "More strategic"? Please.
The standard answer is "prompt engineering" or "AI management" or some other freshly invented title that sounds like a career but is really a group coping mechanism with a LinkedIn badge. Like telling horse stable workers to become "automotive transition specialists." The horse doesn't care about your new title.
The $1.7 trillion question
The US has $1.7 trillion in outstanding student loan debt. That number was built on a promise: get a degree, get a knowledge job, pay back the loan. A beautiful little conveyor belt of obedience and optimism.
If the ROI on a four-year degree in accounting, paralegal studies, financial analysis, or basic software development turns negative, that $1.7 trillion becomes a time bomb. Not because people can't pay. Because the jobs the degrees pointed to are evaporating while the debt stays perfectly solid.
India has its own version. $283 billion in IT services, built on the premise that five million people can code cheaper than Americans. AI just said "I'll do what they do, but free. And I don't need a cafeteria." TCS, Infosys, Wipro. The body shop model is staring at the same broken ladder. Their people can't "upskill" into something above knowledge work because there's nothing above knowledge work that operates at that scale.
This isn't a prediction. The $285 billion wiped from IT services stocks on February 3, 2026, was the market doing the math in real time. Markets don't write think pieces. They just price in the obituary.
The asymmetry nobody talks about
Here's where the doom narrative gets it half right and half dangerously wrong.
The doom version: AI replaces everyone. Mass unemployment. Government hands out checks. Techno-feudalism. Very cinematic. Great for conference talks.
The reality is stranger. And more unequal. And way less tweetable.
Marc Andreessen said it on Lenny's podcast last week: "The really great people are becoming spectacularly great. They're not twice as good. They're 10 times as good."
AI is a multiplier. Not an adder.
If you know nothing and you get AI, you can build a basic app. Congrats. So can ten million other people. Your output went from 0 to 1. You and every other person who watched the same YouTube tutorial are now in a gladiator pit fighting for the same scrap of attention.
If you have twenty years of domain expertise and you get AI, you can build things the 0-to-1 crowd can't even debug. Your output went from 8 to 80. Same tool. Completely different ceiling. It's like giving a Ferrari to a Formula 1 driver and a teenager who just got their permit. Same car. Very different lap times. Very different survival rates.
This is the asymmetry. AI doesn't flatten the playing field. It tilts it. Violently. In favor of people who already know things deeply.
A non-coder with ChatGPT builds a Gita app in a week. 819K views on Twitter. Impressive? Sure. Useful for six months? Maybe. Maintainable at scale? No. But hey, great thread.
Someone with twenty years in financial markets, plus AI, builds a backtesting system that processes decades of pattern data and surfaces signals the app-builder doesn't even know exist. Same tool. Different hands. Different outcome by orders of magnitude. One gets Twitter impressions. The other gets alpha.
The ladder isn't gone for everyone. It's gone for people who were on the middle rungs doing commodity knowledge work. For people with real depth, AI didn't remove the ladder. It installed an elevator. With a penthouse button.
So where do the displaced go?
Honest answer: I don't fully know. Nobody does. Anyone who tells you they have the complete answer is selling a course.
But the pattern of history offers a clue that most people miss because they're too busy panicking.
Every time production got cheaper, humans didn't stop working. They started spending on new things that didn't exist before. We're weirdly reliable that way.
Food got cheap. People didn't stop buying. They bought manufactured goods (which hadn't existed as consumer products before). Manufactured goods got cheap. People didn't stop buying. They bought services (which barely existed as an industry before). Information got cheap. People didn't stop buying. They bought experiences, convenience, and status (categories that would have confused someone in 1950).

Every single time, the smartest economists alive said "this time it's different," and every single time they were wrong. Maybe they're right this time. I wouldn't bet on it.
The consumption category that absorbs the AI-displaced workforce doesn't exist yet. That's not a cop-out. That's the literal historical pattern. Every single time, the new category was invisible from the vantage point of the old one. You can't see the next floor when you're standing on the current one. That's not a bug. That's how it works.
In 1900, you couldn't have predicted that "social media manager" would be a job. In 1980, you couldn't have predicted that "UX designer" would be a career. In 2005, you couldn't have predicted that "YouTuber" would be a profession that pays more than surgery.
The new categories will emerge. They always do. The gap between "old jobs die" and "new jobs emerge" is where the pain lives. And this time, the gap might be shorter but more intense than any we've seen. Cold comfort if you're in the gap right now.
What actually survives
While we wait for new categories to crystallize, some things are already clear about what survives. Spoiler: it's not what LinkedIn influencers are telling you.
Domain expertise multiplied by AI. Not "I know AI." Not "I can prompt." That's the commodity. Everyone can prompt. My mom can prompt. What survives is "I know this specific domain so deeply that AI makes me dangerous." A radiologist who uses AI reads scans at a level no AI alone and no junior doctor can match. A trader with 20 years of pattern recognition plus AI backtesting finds signals invisible to both humans and machines working separately. The 8-to-80 people. They're not worried. They're thriving. It's almost annoying.
Judgment over execution. When code is cheap, knowing what to build matters more than knowing how to build it. When legal research is instant, knowing which argument wins matters more than finding the precedent. Execution becomes the easy part. Judgment becomes the scarce part. I used to bill for my hands. Now I bill for my taste. The hourly rate went up.
Trust and relationships. No agent closes a deal over dinner. No LLM navigates office politics. No AI understands why the client is really frustrated. Not what they said, but what they meant. Human trust is a moat that doesn't erode with capability improvements. Try getting an LLM to read the room when the CEO's smile doesn't reach their eyes.
Meaning-making. Philosophy, spirituality, purpose, community. AI can simulate empathy. It can't provide meaning. When knowledge work gets automated, the question "what is this all for?" gets louder. The people who help answer it, whether they're therapists, coaches, spiritual guides, or just good friends who pick up the phone at 2am, become more valuable, not less. The robots took the thinking. They left the feeling.
The honest position
I run an AI-native engineering consultancy. I use agents for 60-70% of my work. I've watched my own output multiply by 3-5x in the last six months. I am, by any reasonable definition, one of the people the strangulation thesis is about. I'm the engineer who uses AI to do the work of a small team. I'm part of the problem I'm writing about. I know. The irony is not lost on me.
And I'll tell you what I see from here: the doom narratives are half right. The displacement is real. The velocity of money problem is real. The upskilling dead end for commodity knowledge work is real. All real. All happening now. Not in some speculative future. Now.
But the assumption that this ends in techno-feudalism where citizens collect stipends and corporations own all the machines? That assumes AI stays expensive and centralized. It doesn't. Models are getting cheaper, open source is accelerating, and inference costs are collapsing. The feudalism thesis requires the serfs to stay unarmed. The serfs are getting the same weapons.
The more likely future: millions of individuals with domain expertise become one-person companies, directing their own agent swarms, competing with corporations on output if not on scale. Not because some government program enabled it. Because the tools got cheap enough that a person with depth and judgment doesn't need a team anymore. I know this because I'm living it. It's lonely sometimes. But it works.
That's not utopia. It's messy, unequal, and full of transition pain. But it's not feudalism either. It's something new. Something we don't have a word for yet. The economists will name it in 2035 and win prizes for describing what already happened.
The ladder is gone. What replaces it isn't another ladder. It's a different geometry entirely. And the people who stop looking for the next rung and start exploring the new shape will be the ones who find their footing.
The rest will keep staring at where the ladder used to be, waiting for someone to rebuild it.
Nobody's rebuilding it.
Ashutosh Makwana
10+ years engineering. AI-native since 2022. Building things that think.
