The costume just got cheap
You can’t search your own mind.
Not a metaphor. A design flaw. A person spent three months writing essays about judgment, pattern recognition, AI, and the future of expertise. Hundreds of pages. Then he tried to find one of them — the one where he’d argued that maybe, just maybe, the machines could do judgment too. He knew the argument. He remembered the feeling of writing it. He could paraphrase the key line. But he couldn’t find it. Not because it was lost, but because the way he remembered it and the way he’d written it used different words for the same idea.
That’s the problem with human memory. We store by meaning, not by keyword. We recall the shape of the thought, not the sentence. And when we go looking for what we know, we search with the vocabulary we have now, which has drifted since we wrote it down. The gap between how you remember something and how you said it is wide enough to lose whole arguments in.
Organizations have this problem at industrial scale. The senior strategist who “just knows” which positioning approach will work — she’s carrying hundreds of pattern matches from prior engagements, but if you asked her to find the one most relevant to the current client, she’d struggle. Not because she doesn’t know it. Because her retrieval system is associative and leaky, not indexed and precise. She recognizes patterns when she encounters them. She can’t search for them on demand.
That’s the first crack in the idea that human expertise is irreplaceable: it’s not even fully accessible to the person who has it.
There are three layers to what any expert does, and almost nobody has sorted them correctly.
The bottom layer is obvious: the procedures, the checklists, the status updates, the formatting. Everyone agrees this is automatable. It’s already happening. Let it go.
The top layer gets all the philosophical attention: genuine judgment. The decision with no precedent. The moment where the patterns break and someone has to commit to a direction without knowing if it’s right. The surgeon who abandons the protocol mid-operation. The founder who pivots because the market she planned for doesn’t exist. This is real, and it’s rare, and it’s irreducibly human. For now.
But the middle layer — this is where careers live, and die, and get repriced. Expert pattern recognition that feels like judgment from the inside. The creative director who says “this doesn’t sound like the client” — he’s right, but he’s right because he’s internalized the client’s voice over fifty projects and he’s matching against it. The risk analyst who flags a deal as “off” — she’s right because she’s seen three hundred deals with similar structures and this one matches the shape of the ones that went bad. They experience this as intuition. As judgment. As the thing that makes them worth $200 an hour.
It’s not. It’s pattern recognition running below conscious awareness. The patterns became invisible through expertise, which is exactly why the expert can’t see them as patterns. She sees them as reality.
The philosopher Hubert Dreyfus spent decades explaining why this happens. At the highest levels of skill, the rules disappear. The chess master doesn’t calculate moves — she sees the board. The experienced nurse doesn’t follow a diagnostic checklist — she walks into the room and knows something is wrong. Dreyfus was right about the phenomenology. But he drew the wrong conclusion. He argued that because expert performance feels different from rule-following, it must be fundamentally different in kind. It isn’t. It’s pattern matching compiled down into perception. And compiled perception can be decompiled.
That’s what extraction is. You sit with the expert and slow down the moment where she “just knows.” You ask: what did you see? What made this one different from the last three? If you had to train someone to make this call, what would you tell them to look for? Gradually, the invisible patterns become visible. The intuition becomes a decision tree. Not a simple one — it might have a hundred nodes. But a tree nonetheless.
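To make the claim concrete, here is a drastically simplified sketch of what one extracted pattern can look like once it is written down. Every variable, threshold, and branch below is invented for illustration; a real extraction might surface a hundred such nodes, not two.

```python
# A hypothetical fragment of an extracted decision tree. The variables and
# thresholds are invented for illustration, not taken from any real analyst.

def deal_feels_off(deal: dict) -> bool:
    """Encodes the kind of pattern an analyst might describe as 'this one is off'."""
    # Pattern 1: aggressive growth assumptions paired with a thin operating history.
    if deal["projected_growth"] > 0.4 and deal["years_operating"] < 3:
        return True
    # Pattern 2: a structure that resembles deals that previously went bad.
    if deal["earnout_share"] > 0.5 and not deal["founder_staying"]:
        return True
    # No known bad pattern matched. Anything unprecedented from here on
    # is the residue that still requires genuine judgment.
    return False
```

The point of the sketch is not the rules themselves but their form: once the questions are asked and answered, the "instinct" becomes a set of named variables and explicit branches that someone else, or something else, can run.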
Most experts discover that 60 to 80 percent of their “judgment” is actually encoded pattern recognition. The remaining 20 to 40 percent is genuinely irreducible — the moments where the patterns don’t match anything they’ve seen, and they have to choose without precedent. But that’s a much smaller slice of their work than they believed.
This is simultaneously the most humbling and the most liberating discovery a professional can make. Humbling because they thought more of their work was irreplaceable than it is. Liberating because the stuff that actually is irreplaceable is more interesting, more important, and more essentially theirs than they realized. When you’re not exhausted from eight hours of sophisticated pattern matching, you do the genuine judgment work better than you ever have.
The market figured out the bottom layer years ago. Robotic process automation, workflow tools, AI assistants — the execution layer has been under pressure since 2020. Everyone sees it coming. Most organizations have a strategy for it, even if the strategy is “ignore it and hope.”
The market hasn’t figured out the middle layer yet, because the middle layer has been misidentified. It’s filed under “irreplaceable human expertise” in every HR department, every consulting pitch, every business school case study. The people doing it genuinely believe they’re exercising judgment, not matching patterns. Their managers believe it. Their clients believe it. The entire compensation structure of knowledge work is built on this misidentification.
That’s why a single infographic can be an existential crisis. Twelve tools, sixty dollars a month, one person running a full marketing operation. The content tool writes. The design tool designs. The strategy tool analyzes. The coordinator tool coordinates. Each one replaced a role that someone believed required judgment. Each one turned out to require pattern recognition that could be captured in a prompt template.
The people who built those tools — the ones who said “we’ll help you adopt AI” — are now being eaten by the tools themselves. The AI content platform that charged $500 a month is competing against a free chat window that does the same thing. The competitive moat was never the tool. The moat was the pattern library inside the expert’s head. And the chat window just learned to build its own.
So what’s left?
Two things. First, the extraction itself — the process of sitting with an expert and separating what they know from what they are. This is a service, not a product. You can’t automate the conversation where a twenty-year strategist discovers that her “instinct” about market positioning is actually a learnable heuristic based on four variables she’s never articulated. That conversation requires a human who understands both the philosophy of expertise and the mechanics of encoding it. That’s a rare combination. Rare enough to charge for.
Second, the genuine judgment work that remains after extraction. The 20 to 40 percent. The decisions with no precedent. The moments where the methodology explicitly can’t help because the methodology was built for the world before this moment. Organizations need humans for this. But they need far fewer humans, working on much harder problems, with much better support. The extraction process doesn’t eliminate the expert — it promotes her. From expensive pattern matcher to genuine decision-maker. From the person who reviews forty decks a week to the person who decides which markets to enter.
Most organizations aren’t ready for this conversation. They’re still arguing about whether AI will “replace” anyone, which is like arguing about whether the car will replace the horse while the highway system is being built around you. The replacement already happened. The question is what the humans do now that the pattern work is handled.
There’s a deeper discomfort here that nobody wants to name. If 80 percent of what you thought was judgment turns out to be pattern recognition, what does that say about you? Not about your job — about you. Your professional identity was built on the belief that you bring something irreplaceable to the table. You spent twenty years developing expertise that felt like wisdom. You made calls that felt like courage. And now someone is telling you that most of it was sophisticated sorting.
This is the second Götterdämmerung. The first stripped away the procedural layer — the filing, the formatting, the coordination. Everyone could see it happening and most people adjusted. The second strips away the pattern layer, and almost nobody can see it coming because it feels like them. The tool became invisible through use, and now they can’t tell the difference between the tool and the self.
Wagner’s gods don’t die because they’re defeated. They die because the world moves on and no longer needs them. The ring returns to the Rhine. The cycle ends. Not with a battle, but with a recognition: this was always temporary. The expertise was always a constraint being managed, not a gift being exercised.
Harsh. Also incomplete. Because the 20 percent that remains — the genuine judgment, the unprecedented decisions, the moments of real courage — that part is more valuable now than it has ever been. When the pattern work is cheap, the judgment work becomes priceless. When the kitchen can cook anything instantly, the only question that matters is what should be on the menu.
The hardest part of any pivot is admitting that what you built was solving the wrong problem. Not a bad problem. Not an unimportant problem. Just one that the world solved without you while you were building. Three months of work, solid engineering, good ideas, real users — and the floor rose faster than the building. The products aren’t failures. They’re artifacts of a window that closed.
What’s left after the products die is the thinking. The essays, the frameworks, the methodology that evolved through contact with reality. The proof that the middle layer is encodable. That’s not a product. It’s a practice. And practices don’t get commoditized by chat windows, because practices require a practitioner.
The question isn’t “what can AI do?” anymore. Everyone knows what AI can do. The question is: what were you doing all along that you thought was work but was actually just the costume? Strip that away and what’s left is either a human being with genuine judgment — or someone who discovers they were never more than the role.
That’s what the assessment asks. Not in philosophical terms. In practical, uncomfortable, four-minute terms. How much of your team’s expertise walks out the door when they do? The answer tells you everything about whether you have a judgment organization or a pattern-matching organization wearing a judgment costume.
Most of them are wearing the costume. And the costume just got very, very cheap.