There's a particular kind of vertigo that comes not from failing, but from watching the rules of success change while you're still playing the game.
That's where a lot of high-achievers are right now. Not struggling. Not underperforming. But feeling, for the first time in their careers, like the ground beneath their expertise is shifting in ways they didn't authorize and can't fully see. They've built something real: knowledge, reputation, a body of work. And now they're watching a technology reshape what that's worth, without asking anyone's permission.
The response, for most people, is to move fast. Upskill. Learn the tools. Signal fluency. Stay visible.
That's not wrong. But it's not enough — and for some people, it's a distraction from the more important work.
The Upskilling Trap
There's a version of adapting to AI that is really just anxiety in productivity clothing.
You take the courses. You learn the platforms. You put the tools in your workflow and post about it thoughtfully. And underneath all of it is a quiet, urgent question you haven't quite said out loud: Am I doing enough to not become irrelevant?
That question is understandable. But it's also worth examining, because it's pointing in the wrong direction.
Upskilling from fear optimizes for the wrong thing. It treats the AI era as a technical problem — a skills gap to close, a knowledge base to update — when the deeper shift is something else entirely. The people who are navigating this well aren't the ones who moved fastest to learn the tools. They're the ones who got clear, quickly, on what the tools cannot do. And then they invested deliberately in that.
What I See from the Inside
I run a boutique AI consultancy alongside my coaching practice — four former BCG colleagues with our hands directly in this space, watching how organizations are actually integrating these technologies, not just how they're talking about them in press releases and social media posts.
What we see is this: the disruption is real, it's uneven, and it's moving faster than most organizations are honestly prepared to admit.
But here's what's also true — and what rarely makes it into the public conversation: the most acute anxiety isn't in the people who know nothing about AI. It's in the people who know enough to understand what's at stake but aren't yet sure where they stand.
Creative Directors are a vivid example of this. They understand, better than most, what AI-generated content looks like at scale. They can see through it — the subtle flatness, the missing tension, the brand inconsistency that emerges when you remove the human eye from the equation. And they're working hard right now to articulate something they've always done intuitively: that their value isn't their ability to produce. It's their ability to judge. To protect a brand from itself. To know when something is technically competent and strategically wrong.
That articulation matters enormously. Because the leaders who can name what they bring (specifically, credibly, beyond just "the human touch") are the ones who will hold ground in this shift and shape what comes next. The ones who can't name it will get crowded out, not by AI, but by peers who were more precise about their own value. That's true right now. And given how fast this is moving, right now is what matters.
The Real Shift: From Knowledge to Judgment
Here's the reframe that I keep coming back to, in my consultancy work and in my coaching practice alike:
AI is very good at knowledge. It is not good at judgment.
It can synthesize information at a scale no human can match. It can generate options, surface patterns, produce competent first drafts across almost any domain. What it cannot do is carry the weight of a decision. It cannot read the room. It cannot hold the tension between what is technically correct and what is right for this organization, this moment, this particular set of human stakes.
That's what judgment is. And judgment is built from something AI cannot replicate: lived experience, pattern recognition earned through failure, the scar tissue of having been wrong and having learned from it.
Your knowledge base was always going to become commoditized eventually. Every cycle of technology has moved what was once specialist knowledge into the hands of more people. The professionals who thrived through every prior shift weren't the ones who hoarded knowledge. They were the ones who developed the wisdom to apply it.
This moment is different in one respect: it's not just another technology cycle but a paradigm shift, moving at a pace that outstrips most people's ability to fully absorb what's changing. The underlying lesson still holds. The speed is just unforgiving in a way prior shifts were not.
Three Questions Worth Sitting With
If you're navigating your professional identity in the middle of this shift, these are the questions I'd bring to the work:
1. What would only you know to do — even if AI gave you all the information? Not what you know. What you do with what you know. The decisions you make, the things you weigh, the calls you get right because of who you are and what you've been through. That's the inventory worth taking.
2. Are you upskilling toward your edge, or away from your fear? Both can produce activity. Only one produces positioning. If you're learning AI tools because they amplify what you already do distinctively well, that's leverage. If you're learning them because you're scared of being left behind, you'll end up with fluency in tools and no clearer story about your value.
3. Can you articulate what the human version of your work protects? Not in abstract terms. Specifically. What gets lost when AI takes over the part you do? What goes wrong? What does your judgment catch that a model would miss? If you can answer that with specificity and confidence, you have a positioning statement that will hold — not forever, nothing does in this environment, but long enough to matter and build from. If you can't, that's the work.
This Is a Threshold, Not a Crisis
Every major technology shift has felt, from the inside, like the ground was disappearing. It wasn't. It was changing shape.
What tends to disappear in these moments are the roles that were always more about process than judgment — the jobs that were really just the management of information. Those do get disrupted, and honestly, many of them should be. They were never the most interesting or valuable application of human intelligence anyway.
What doesn't disappear — what can't — is the capacity to lead, to decide, to build trust, to read what's true beneath what's being said, to hold a long view when everyone around you is reacting to the short one.
That capacity is what your career has been developing all along. The AI era doesn't threaten it. It clarifies it.
The question worth asking isn't how do I survive this shift? It's: What does this shift reveal about what I was always actually good at?
That's not a comfortable question. But it's the right one. And it's the kind of question that, in my experience, only becomes available to you once you stop running from the disruption and start getting really clear about what you bring to it.
I sit at an unusual intersection of this shift — as a coach working with leaders navigating it personally, and as an AI consultant watching it unfold at the organizational level in real time. That dual perspective shapes everything I bring to this work. If you're trying to find your footing in the middle of this transformation — not just professionally, but in terms of who you are and what you're building — that's exactly the conversation I'm here for.