Paul Welty, PhD AI, WORK, AND STAYING HUMAN

· essays

The agent-shaped org chart

Every real org has the same topology: principal, role-holder, specialists. Staff AI maps onto it, node for node, and the cost collapse shows up in the deliverables that were always just human-handoff overhead.

Every org chart in the world has the same underlying topology, whether or not the people inside it can see it.

At the top is a principal — someone who makes the decisions and owns the outcome. Below the principal is a role-holder — the VP, the director, the head-of-X, the person accountable for a specific function. Below the role-holder is a roster of specialists — designers, analysts, engineers, writers, the people who do the actual labor of the function.

The principal doesn’t do the labor. The role-holder doesn’t do the labor. The specialists do the labor, the role-holder orchestrates them, the principal talks to the role-holder.

Staff AI — AI that plays a role rather than functioning as a tool (see AI as staff, not software) — maps onto this exact topology, agent for human at each layer:

Principal (human, retained)
    ↕ conversational
Persistent agent (the role)
    ↓ dispatches
Ephemeral specialist subagents (the labor)
    ↓ use
Software (plumbing)

The principal stays human. The role-holder becomes a persistent agent. The specialists become ephemeral subagents dispatched on demand.

The shape isn’t new. The shape is what every effective org was already doing. The only thing that changes is the cost structure: the role-holder and the specialists, who used to be salaried humans, become software-denominated.
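The topology can be sketched in a few lines. This is a minimal illustration of the shape, not any particular framework; every name in it is invented for the example:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Specialist:
    """Ephemeral subagent: created for one task, discarded after."""
    skill: str
    run: Callable[[str], str]

@dataclass
class RoleAgent:
    """Persistent agent holding a role: talks to the principal, dispatches labor."""
    role: str

    def dispatch(self, skill: str, task: str) -> str:
        # Spin up a specialist for this one task; it holds no lasting state.
        worker = Specialist(skill, lambda t: f"[{skill}] done: {t}")
        result = worker.run(task)
        del worker  # ephemeral: nothing persists except the result
        return result

cfo = RoleAgent("CFO")
print(cfo.dispatch("bookkeeping", "reconcile March accounts"))
```

The persistent object is the role; the specialists exist only for the duration of a task, which is the whole point of the middle layer.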

Two examples to make it concrete

The CFO role. A CEO currently employs a CFO at $250K+/year, plus a bookkeeper, an auditor (as needed), a tax preparer, a budget analyst, a compliance officer, and a payroll specialist — some in-house, some outsourced. The CFO role is: watch cashflow, report monthly, raise flags, make strategic finance calls, field board questions. The specialist labor is: recording transactions, reconciling accounts, filing returns, running projections, checking compliance.

With staff AI: the CEO still needs the CFO function, but can replace the role with a persistent CFO-agent that knows the books, watches the metrics, dispatches specialist subagents when it needs bookkeeping done, returns filed, analysis run. Annual cost collapses from something like $600K (CFO + staff + fees) to something in four figures.

The head of marketing role. A CEO employs a VP of Marketing at $200K+, plus designers, copywriters, SEO analysts, media buyers, researchers. Same shape. Same collapse.

This is not a thought experiment. This is what the tools are already doing in the hands of anyone who’s been paying attention for the last six months.

Where software fits

Software doesn’t disappear. Software becomes plumbing.

Apps stop being destinations humans navigate. They become APIs that agents call. The CRM is a database the sales-agent reads and writes. The accounting tool is a ledger the CFO-agent updates. The design tool is something the designer-subagent uses to render a file. The email system is a transport the communications-subagent posts to.

Software remains. The app surface — the forms, the dashboards, the “go to the tool and do the thing” half of software — disappears from the human’s life for any role that’s been replaced. The form was always just a UI over data. Agents don’t need UIs. They need data.

Tool-frame adopters treat AI as a smarter way to operate existing apps. Staff-frame adopters realize the apps were a human accommodation. If no human is operating the function, the app doesn’t need to be an app. It just needs to be a data layer with read and write APIs.
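What "data layer with read and write APIs" means can be made concrete with a sketch. The `CRM` and `SalesAgent` classes below are hypothetical stand-ins, assuming only that the tool exposes read/write operations with no UI in between:

```python
class CRM:
    """A data layer: read and write endpoints, no forms, no dashboards."""
    def __init__(self):
        self._records = {}

    def read(self, key):
        return self._records.get(key)

    def write(self, key, value):
        self._records[key] = value

class SalesAgent:
    """The agent operates the function against the data layer directly."""
    def __init__(self, crm: CRM):
        self.crm = crm

    def log_call(self, account: str, note: str):
        # What a human did through a form, the agent does as a read-modify-write.
        history = self.crm.read(account) or []
        self.crm.write(account, history + [note])

crm = CRM()
agent = SalesAgent(crm)
agent.log_call("acme", "renewal discussed")
print(crm.read("acme"))
```

No human navigated anything; the app surface the form used to provide simply has no caller left.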

The pattern is fractal

Every persistent agent can itself be a principal for its own sub-structure. You are the principal to your COO-agent. Your COO-agent is the principal to its marketing-agent. The marketing-agent is the principal to its designer-subagent. At every layer, the same shape: one long-running role, dispatching ephemeral specialists.

This matters because it means adoption can start anywhere in the tree. You don’t have to replace the whole org chart in one move. Pick a leaf role — a specialist function that’s all tasks and no people-management — and make it a persistent-agent shape. Measure. Move up a layer. Repeat.
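The recursion itself fits in a few lines. A toy sketch, with invented role names, showing that every layer has the identical one-role-over-specialists shape:

```python
class Node:
    """One persistent role; its reports are the sub-roles it runs as principal."""
    def __init__(self, role, reports=None):
        self.role = role
        self.reports = reports or []

    def depth(self):
        # Same shape at every layer: one long-running role, dispatching downward.
        return 1 + max((r.depth() for r in self.reports), default=0)

designer = Node("designer-subagent")
marketing = Node("marketing-agent", [designer])
coo = Node("COO-agent", [marketing])
you = Node("principal", [coo])
print(you.depth())  # 4 layers, identical shape at each
```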

The fractal also means there is no safe harbor in the tree. Your own role, whatever it is, is subject to the same replacement pressure from whatever principal is above you. The pattern doesn’t care about anyone’s position in particular. It just keeps rearranging until the humans left in the tree are the principals whose judgment is the thing actually being paid for.

Deliverables are a transitional artifact

This is the least obvious and most consequential part of the pattern.

Every role in a human org produces and consumes deliverables — briefs, plans, recaps, decks, reports, specs, status updates, strategic memos. The marketing VP writes a quarterly plan. The CFO produces a monthly report. The product director hands off a spec. Every handoff between humans is mediated by a document.

The documents exist because humans have cognitive impedance between each other. You can’t hand raw thinking from one person to another, so you compress it into an artifact — a brief, a plan, a report — that the next human reads and partially reconstitutes in their own head. Deliverables are the cost of same-substance communication across different brains. They are not the work. They are the packaging around the work for human consumption.

When a role collapses into the layer above — when the marketing VP is absorbed into the principal-agent, and the specialists move up one level — the deliverables between those layers also disappear. The marketing-director-agent doesn’t need to write a brief for itself. The principal-agent doesn’t need a monthly report from a function it’s already running in its own context window. Same-substance nodes have no impedance; no artifact is required.

This is not automation. Automation would be “generate the brief faster with AI.” What actually happens is the brief is no longer needed at all, because the human who was going to read it isn’t in the loop.

Each nodal collapse cascades beyond the role itself: a whole set of artifacts becomes unnecessary. The hours spent producing briefs, prepping meetings, writing memos, making decks, summarizing status — hours that vastly exceed the “decision” hours of a senior role — evaporate along with the role. The real cost collapse isn’t the salary. It’s the deliverable-production machinery that the salary was running.

The partial-collapse case

When the collapse is only partial — when humans remain in adjacent slots, or external parties (clients, boards, regulators) still expect deliverables in the old shape — the artifacts persist as imitation. The replacement agent still writes the brief because the human creative director expects one. The COO-agent still produces a weekly status update because the human principal wants to review one. The agents imitate the human coordination style because the loop isn’t fully agent-native yet.

Imitation is fine as a transition. But it’s important to see that the imitation is pure cost. These artifacts serve no function except keeping humans feeling oriented, and in a fully agent-native configuration they vanish.

What this implies for existing AI tools

Most “AI for business” products today optimize deliverable production. Generate a marketing plan in 5 minutes. Draft an email in 30 seconds. Produce a board deck. Summarize the meeting. These are tool-frame products, building faster factories for artifacts that were only needed because humans were handing off to humans. When the humans disappear from between the nodes, these tools will have optimized a layer that no longer exists.

Staff-frame products don’t generate deliverables faster. They eliminate the need for the deliverable by keeping the work inside a single persistent context. The campaign just runs. The books are just current. The pipeline is just the pipeline. Nobody produces a “pipeline report” because the principal-agent already has the pipeline.

Any product optimizing deliverable production is building on sand. The layer it optimizes is about to collapse out from under it.

The overlay

Here’s how this becomes a tool for making decisions, not just a framework for making observations.

Take any org chart. For each role in it, ask five questions:

  1. Current cost. Salary plus benefits plus management overhead plus software spend attributable to this role.
  2. Persistent-agent candidate. What would the role-as-agent look like? What’s its system prompt, what’s it always watching, what does it report and to whom?
  3. Ephemeral subagents. What specialist labor does the role dispatch? Which of those are clean, task-shaped jobs?
  4. Deliverables that disappear. What briefs, reports, decks, memos, status updates, plans does this role currently produce and consume — and which of those are human-handoff artifacts that vanish when the handoff isn’t to another human? This is usually where the biggest hidden cost sits: not the role’s salary but the artifact machinery it was driving.
  5. Software layer. What APIs do the persistent agent and its subagents need? What app surfaces fall away once no human is navigating them?

Paint the answers onto the existing org chart. Color the roles that can be agent-replaced now. Color the roles that need another year of tooling. Leave the roles that genuinely require human judgment uncolored — those are where the principals are, and those are what survive. Annotate each replaced role with the cost collapse.
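The overlay is, mechanically, a small table. A sketch of what one row per role might look like; every figure below is a placeholder, not a claim about any real org:

```python
from dataclasses import dataclass

@dataclass
class RoleAudit:
    """One row of the overlay. All numbers are illustrative placeholders."""
    role: str
    current_cost: float    # salary + benefits + overhead + attributable software
    agent_cost: float      # estimated annual cost of the agent stack
    replaceable_now: bool  # the judgment call from question 2

    @property
    def collapse(self) -> float:
        return self.current_cost - self.agent_cost

overlay = [
    RoleAudit("CFO function", 600_000, 8_000, True),
    RoleAudit("VP Marketing", 450_000, 6_000, True),
    RoleAudit("CEO", 0, 0, False),  # principal: judgment is what's being paid for
]

for row in overlay:
    if row.replaceable_now:
        print(f"{row.role}: collapse ${row.collapse:,.0f}/yr")
```

The painting step is just this table joined back onto the org chart: colored where `replaceable_now` is true, annotated with `collapse`, left blank where the principal sits.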

This is the Value Chain Overlay. It turns the abstract claim “staff AI replaces roles” into a concrete map of a specific org’s cost structure, specific agent decomposition, specific collapse numbers. It makes visible what would otherwise require imagination.

The overlay doesn’t prescribe replacement. It makes replacement visible and priced. What a CEO does with that visibility is their call.

Cost redirection, not cost reduction

One more thing, because this is the part most people get wrong.

The collapse the overlay names is real. The savings are real. The temptation will be to pocket the savings as headcount reduction — close the gap inward, take the money. That’s the error.

The window between now and Phase 3 (the New Equilibrium, when the staff-frame is the default and tool-frame firms have been priced out of their own markets) is the only chance to prepare the humans who remain for the kind of work that compounds. Judgment. Discernment. The actual principal-shaped questions. If a firm pockets the Reckoning savings and keeps the same humans doing the same production work the agents are about to do free, the New Equilibrium arrives as a freight train.

Reckoning savings aren’t cost reduction. They’re cost redirection.


If you want to see the overlay applied to your own org, email [email protected]. It’s a half-day engagement. You walk out with a map.
