AI EMERGENCE 19 March 2026

When AI Changes Who You Are

The quiet crisis in workplaces isn't about tools. It's about identity.

You used to be the person who knew the codebase. The one who could draft a strategy doc faster than anyone in the room. The designer whose taste was the product’s taste.

Then the tools arrived.

Familiar Ground

You know the surface story. Companies roll out AI tools. Teams are told to adopt them. Some people take to it instantly. Others drag their feet. Leadership calls it a “skills gap” and prescribes training. A lunch-and-learn here. A Slack channel full of tips there. Check the box. Move on.

This is how most organisations frame AI adoption: a tooling challenge. Learn the new thing. Keep up. Or fall behind.

There is a quieter version of this story that nobody is telling.

Counter-Signal

Rangaprabhu Parthasarathy is a Director of Product at Meta who has shipped Kindle e-readers, the first Echo Dot, Oculus Quest 1 and 2, and Llama models. He is not behind the curve. He has near-unlimited access to AI tools and the freedom to experiment. If anyone should feel comfortable, it is him.

And yet, here is what he wrote:

“What’s happening in workplaces right now isn’t just a shift in tools. It’s a shift in identity. People aren’t just learning new software. They’re renegotiating their sense of what they’re good at, how they contribute, and where they fit.”

Read that again.

The disorientation you feel is not a lack of training. It is not a skills gap. It is not that you are too slow or too old or too resistant. It is that the things you built your professional identity around, the skills that made you you at work, are being repriced in real time. And that repricing is happening faster than any human can comfortably absorb.

⚛️ The Fusion

Three ideas collide here.

Identity renegotiation. Prabhu names what most adoption frameworks miss entirely. When your tools change, you do not just learn new software. You recalibrate who you are. The engineer who prided herself on writing elegant code now watches an AI produce equivalent output in seconds. The product manager whose edge was cross-functional translation discovers that half the translation layer has evaporated. The designer whose eye was the differentiator finds the first draft is now machine-generated. These are not productivity problems. They are existential ones.

Liminality. Anthropologists have a word for the space between what you were and what you have not yet become: liminal (from Latin limen, threshold). In traditional rites of passage, liminality is the dangerous middle, where the old identity has dissolved but the new one has not yet formed. Every workplace adopting AI is in a collective liminal state right now. The old roles have cracked open. The new ones have not solidified. And in that gap, people feel something that training cannot fix: a loss of ground.

Organisational Soul. Living systems theory, the view that organisations are not machines but living organisms, identifies three layers: Mind (cognition, strategy), Body (infrastructure, capability), and Soul (identity, purpose, values). Most AI adoption programmes invest heavily in Body (tools, platforms) and Mind (training, process redesign). Almost none invest in Soul: the collective identity that tells people who they are, why they matter, and what the organisation values about them as humans.

What if you could see AI adoption not as a training problem, but as a transition ritual that organisations are running without the one ingredient that makes rituals work: a container for identity?

The New Pattern

| Skills-Gap Framing | Identity-Renegotiation Framing |
| --- | --- |
| “Learn the tools” | “Rediscover what you bring” |
| Training fixes the problem | Conditions enable the transition |
| Anxiety = resistance | Anxiety = healthy response to genuine loss |
| Speed of adoption = success | Depth of integration = success |
| People must “keep up” | Organisations must hold the space |
| Measures tool usage | Measures identity coherence |

The organisations that navigate this well will not be the fastest adopters. They will be the ones that created the most human conditions for transition. Protected time for learning, not squeezed into the margins. Environments where asking basic questions is not embarrassing. Peer learning, small experiments, low-stakes exploration. These are not luxuries. They are the Soul-layer infrastructure that makes identity renegotiation survivable.

Prabhu’s own practice illustrates this. He built an AI Chief of Staff using Claude Code, the PARA method, and a root identity file that defines who he is, how he works, and what the agent should prioritise. His most surprising discovery? “Ingestion matters more than generation.” The real leverage was not the AI writing for him. It was the AI understanding him.
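The post does not show the contents of that root identity file. As a purely hypothetical sketch, assuming a plain markdown file the agent reads before each session, it might look something like this (the filename and every line of content are illustrative, not Prabhu’s actual setup):

```markdown
<!-- identity.md — hypothetical sketch of a root identity file -->
# Who I am
Product director; I think in systems and have shipped consumer hardware.

# How I work
- Mornings are for deep work; batch meetings after 1pm.
- I prefer one-page docs over slide decks.

# What to prioritise
1. Surface the decisions I am avoiding.
2. Ingest and summarise inbound material before drafting anything new.
```

The point of such a file is the “ingestion” insight from the quote: it gives the agent a stable picture of the person, so its output is shaped by who you are rather than by a generic prompt.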

That is the pattern. The tool does not replace you. It reveals what about you was always more than the tool.

The Open Question

If your organisation is asking “How do we get people to adopt AI faster?”, you are answering the wrong question.

The real one is: what happens to people’s sense of self when the skills they built their identity on become commoditised overnight? And what are you, as a leader, doing to hold that space open long enough for something new to emerge?


This fusion emerged from a deep STEAL on Rangaprabhu Parthasarathy’s platform (March 2026). The research that grounded it lives in persons/rangaprabhu-parthasarathy.

ai-adoption · emergence · living_organizations · mbs_framework · personal_sovereignty