
There Will Be No Jobs
Posted February 09, 2026
Chris Campbell
Jobs are fading.
Not crashing all at once, not vanishing in a single headline-grabbing layoff wave.
Thinning.
Entry-level roles. Junior white-collar work. The bottom rungs of ladders that used to exist almost by default.
At the same time, productivity is inching upward.
Across tech, media, finance, and parts of manufacturing, companies are producing more output per employee while headcount growth stalls or reverses.
Most commentary treats this as apocalyptic.
Institutions like the World Economic Forum warn of “millions of jobs lost to AI,” major outlets talk about the “end of white-collar work,” and tech leaders speculate openly about a jobless future or universal basic income.
The language assumes a cliff—mass unemployment, social collapse…
But the evidence points to something narrower and far more disruptive: a radical change in what it means to have a “job.”
What a “Job” Really Is
The “job” as we know it today is not a timeless feature of human society.
It emerged with the Industrial Age as a coordination solution for factories, machines, and clock-time.
Before industrialization, work usually lived inside households, land, guilds, and seasons, not inside a single, titled job with fixed hours and a weekly wage.
Work was situational.
You moved between tasks, seasons, and skills, sometimes within the same day. Identity came from craft, family, or place, not from employment.
The industrial system needed something different.
Machines were expensive, coordination was hard, and time had to be synchronized.
The job emerged as a social technology to solve that problem. It bundled labor into standardized roles: measured in hours, paid in wages, repeatable across workers.
Over time, that bundle picked up identity, status, and a life path.
“What do you do?” became a primary way of locating someone in society.
Right now, technology is cracking the industrial container.
The Polanyi Paradox
Michael Polanyi was a scientist-turned-philosopher who obsessed over a simple observation: people do sophisticated things every day without being able to explain how they do them.
Doctors diagnose. Craftsmen sense flaws. Experienced operators know when something feels off.
The knowledge is real, useful, and hard to put into rules.
That insight became Polanyi’s Paradox: humans know more than they can tell.
For a long time, that tacit layer protected many jobs. Machines needed instructions. Humans couldn’t fully supply them.
AI changes the mechanics.
It learns patterns directly, without explanation. Tasks once wrapped in intuition become reproducible. Roles built around those tasks thin out.
That’s the part most commentary treats in apocalyptic terms.
The deeper point runs in the other direction.
Tacit knowledge isn’t weak. Some of it is untransmittable.
Pattern-matching gets you capability. It doesn’t get you progress. Progress comes from knowing why things work, and when they should be rethought.
That’s the core claim David Deutsch makes in his (incredible) book The Beginning of Infinity.
The world doesn’t run on labor or capital. It runs on explanations.
Every breakthrough clears old problems and exposes new ones. Better explanations raise standards. Complexity grows. New failure points appear. So do new opportunities.
Machines execute known steps faster than humans ever could. They don’t choose which explanations are worth pursuing or what improvement even means.
That open-ended work never finishes.
It just moves.
Human value doesn’t vanish—it relocates, upward and outward, as the idea of a “job” quietly gives way to something else.
What is that something else? That’s the question.
And we’re diving deep.
Tomorrow, we follow the trail—tracking where work is already unbundling, and how that shift is reshaping markets, margins, and the next wave of investable opportunities.
More to come.
