Paul Welty, PhD · AI, Work, and Staying Human


You were trained to suppress yourself

Organizations didn't accidentally reward the machine-self. They engineered it. And you cooperated because it worked—until now.

You were trained to suppress the most valuable parts of yourself. Not by accident. By design.

Organizations needed processors, not meaning-makers. They needed people who followed processes, hit metrics, produced outputs on schedule. Judgment was expensive. Creativity was unpredictable. Questions slowed things down.

So work rewarded the version of you that showed up on time, stayed quiet, and did what you were told. The machine-self.

Here’s what got suppressed: curiosity about whether the work mattered. Judgment about whether the answer was actually right, not just whether it matched what was expected. Creativity that didn’t fit the template. Reflection that might reveal the whole system was pointed in the wrong direction.

These aren’t small things. They’re what make you human rather than programmable.

I saw this pattern everywhere when I consulted for organizations. Smart people, capable people, who had learned to turn off entire dimensions of themselves at the office door. They’d arrive with questions, insights, concerns about direction. And they’d suppress all of it because the system didn’t reward any of it.

The suppression was functional. It worked. For 200 years, being machine-like was how you succeeded. You got promoted for reliability, not judgment. For execution speed, not discernment. For throughput, not wisdom.

And now AI can do all of that better than you can.

The skills you spent decades developing—the ones that got rewarded, that got you promoted, that proved you were valuable—are exactly the ones getting automated. Because they were always machine skills. You were just playing the role of the machine until the actual machine showed up.

This is what people deny. Not that AI is coming. Not that jobs will change. But that the version of yourself you’ve been performing at work was already diminished. That the suppression happened to you, and you cooperated with it, and it cost you something real.

The machine-self isn’t something you chose. It’s what remained after the filtering process. You showed up as your full self—curious, questioning, creative, discerning—and the workplace systematically filtered out every dimension that didn’t produce predictable output.

You learned quickly. Ask too many questions: flagged as difficult. Suggest a different approach: not a team player. Point out that the metrics are measuring the wrong thing: not strategic thinking, just resistance.

So you stopped. You filtered yourself. You became what the system needed you to be.

And it worked. Until the system automated the exact capabilities it had trained you to develop.

Here’s the uncomfortable part: you can’t go back and reclaim those suppressed capacities without admitting they were suppressed in the first place. Without acknowledging that the professional success you built came from making yourself smaller. More compliant. More machine-like.

That’s the denial. Not “my job might be automated.” But “I was already being treated as automatable, and I accepted it because it was rewarded.”

The question isn’t whether AI will replace you. The question is whether the version of you that work created was ever actually you at all. Whether there’s something underneath the machine-self that can’t be automated because it was never mechanical to begin with.

When I taught at Emory, I’d watch graduate students arrive with curiosity and leave with caution. They’d start asking real questions—questions that challenged the framing, that exposed contradictions, that required sitting with uncertainty. And then they’d learn that academic success meant playing the game: cite the right people, use the right language, don’t challenge too directly.

The ones who succeeded learned to suppress what made them good thinkers. Because the system rewarded compliance, not courage.

The same thing happens in every organization. You learn what gets rewarded. You perform that. Everything else gets filed away as “not appropriate for work.”

The denial is thinking this was just practical adaptation. That there was no cost. That you can perform the machine-self for decades and still have access to the full self when you need it.

You can’t. The capacities you don’t use atrophy. The questions you stop asking stop occurring to you. The judgment you don’t exercise becomes uncertain.

So when you feel threatened by AI, what you’re actually feeling is the recognition that you built your professional identity on skills that were always automatable. That the system trained you to be replaceable and called it career development.

That’s what we deny. That the suppression happened. That it cost something. That the professional success came at the expense of what actually makes you irreplaceable.

AI didn’t create this problem. It exposed it. The machine-self was always a compromise. It just worked well enough that you could ignore the cost.

Now you can’t.

The work isn’t learning to use AI tools. The work is reclaiming what you suppressed to succeed in systems that didn’t value it. Judgment. Discernment. Creativity. The ability to ask whether the work matters, not just whether you did it correctly.

Those capacities are still there. But they’re not going to come back just because you want them to. You have to practice them. Strengthen them. Stop filtering yourself to fit the machine’s requirements.

That’s the work. Not staying relevant. Not learning new tools. Becoming actually human again instead of pretending the machine-self was you all along.

This is what The Work of Being is about. Not tips for the AI age. The uncomfortable recognition that you’ve been performing a diminished version of yourself for years, and the path to reclaiming what was suppressed.

If that’s landing, the book is here: The Work of Being: Staying Human in the Age of AI

This essay first appeared in Philosopher at Large, a weekly newsletter on work, learning, and judgment.
