AI, career insecurity and the rise of quiet cracking
- David Yates

The impact of AI has gone far beyond the initial promise of decreasing workloads and increasing productivity. In some sectors, it is raising questions about whether entire roles and career paths are still needed at all.
For some people, AI has become an existential threat. They must now worry not only about performing well in their role but also about whether that role will even be necessary in the near future.

The existential threat from AI
The rapid expansion of AI over the last few years has introduced a new kind of threat to information workers and intensified an existing form of stress.
In many roles, people used to be able to rely on a relatively stable relationship between effort and value. If they performed well, developed their skills and contributed consistently, their role remained meaningful. That is no longer as clear or as stable as it once was.
AI is not simply increasing efficiency. It is beginning to change what counts as valuable work. Tasks that once required judgement, experience or technical skill can now be completed faster and at scale, often with less human input.
For many, the question is no longer simply how to perform well in the role they have. It is whether that role, and their place within it, will still exist in the same way in the near future.
Uncertainty erodes performance
When faced with this kind of uncertainty, most people do not disengage. They lean in. They work harder, stay later and try to keep up with rising demands, often taking personal responsibility for holding things together even when the system itself is unclear.
This is a rational response. Increasing effort is one of the few levers people feel they still control. For a while, it works. Output is maintained and performance appears stable from the outside.
But the cost of that effort begins to accumulate. When uncertainty cannot be resolved or influenced, it does not sit passively in the background. It continues to draw on attention. The mind remains active, scanning for signals, trying to interpret what matters and anticipating what might change next.
Over time, that reduces the capacity available for the work itself. Decisions take longer, feel heavier and are more easily disrupted. Judgement becomes less fluid and more effortful.
Quiet cracking
This is where quiet cracking begins.
It describes a state in which people remain outwardly functional while part of their cognitive capacity has been displaced by unresolved uncertainty. Attention is still being used, but not only for the work itself. Some of it is being drawn into monitoring change, interpreting signals, anticipating loss and trying to judge where future value still lies.
This form of cognitive displacement is easy to miss. People are still working, still delivering and still showing up. But they are doing so with reduced capacity, and that reduction is often invisible until it begins to affect outcomes.
As this state persists, behaviour starts to change. People become more selective about where they invest effort. They narrow their focus to what they can control and step back from work that feels ambiguous, exposed or unlikely to matter. Initiative drops, contribution becomes more measured and the willingness to take on additional cognitive load reduces.
From the outside, this can look like disengagement. In reality, it is often a response to sustained uncertainty and reduced cognitive capacity. What is being interpreted as a motivation problem is frequently a system signal.
The organisational cost of career insecurity
If AI is creating uncertainty about future relevance, the response cannot be limited to telling people to adapt faster or become more resilient.
When people are no longer sure that effort, judgement and experience will protect their future, behaviour changes. They become more cautious about where they invest energy. They protect themselves. They contribute more selectively. In time, this affects challenge, initiative, learning and decision making.
That matters because organisations may misread the pattern. What looks like lower engagement or weaker motivation may in fact be a rational response to sustained career insecurity. The system is generating the behaviour it later judges.
If organisations want people to keep contributing at a high level, they need to do more than promote AI capability. They need to reduce unnecessary uncertainty. That means being clearer about what is changing, what will still require human judgement and where people will continue to have value.
Without that clarity, the gains from AI may come with a hidden cost. Not immediate collapse, but a slower erosion of confidence, contribution and performance in the people expected to carry the transition.
If this article reflects what is happening in your organisation, the pattern is unlikely to be isolated.
Breakdowns in communication, weak challenge, poor decisions under pressure and slow learning are often connected. They usually point to deeper problems in how work is organised, especially under stress.
At Learn Resilience Now, we help people, teams and leaders understand those patterns and respond more effectively through better decision making, clearer communication and stronger performance under pressure.

Our practical one-day course helps people understand what stress does to the body and mind, recognise their own patterns under pressure and build a stronger personal toolkit for coping, recovery and sustained performance.
You can also learn about our other training courses and workshops here.
