Learn to Notice: What You Miss Matters Under Pressure.
- David Yates

- Nov 13
In high-stakes environments, it’s not knowledge or skill that fails first. It’s what we didn’t notice in time.
Situational awareness isn’t lost in a moment. It erodes quietly.
Not because someone didn’t care, or wasn’t qualified, or lacked commitment, but because their attention, under pressure, narrowed just enough to miss what mattered.
In complex work, we ask people to stay across multiple tasks, absorb new inputs, adjust for uncertainty and still remain alert to signals they may never have seen before.
We ask for readiness, even as the system they’re operating within continues to demand more than the human mind was designed to manage at once.
This is not about failure in the dramatic sense. It’s about the slow fading of clarity in environments that are fast, technical and unpredictable.
In those moments, situational awareness becomes not a trait but a fragile achievement, and it is often the first thing to go.
This piece explores why attention fails under pressure, how that failure becomes invisible until it’s too late, and what high-reliability industries have learned about protecting perception before it breaks.
Because if we want sustainable performance in energy, finance, and technology, we must start where all decision-making begins: with what we see.
Why Awareness Isn’t Always What We Think.
Situational awareness is often misunderstood as simple watchfulness, the idea that if someone is switched on, they’ll see what’s coming. But perception is not passive. It is an active and selective process, shaped by limits in attention, working memory, habit, and expectation.
When those limits are exceeded, awareness can degrade without any visible signal.
Endsley’s model of situational awareness defines three layers:
Level 1 (Perception): noticing key elements in the environment
Level 2 (Comprehension): understanding what they mean
Level 3 (Projection): anticipating what happens next
The entire model collapses if Level 1 is compromised, and Level 1 is where the human mind is most vulnerable under pressure.
Daniel Kahneman’s work on attention and cognitive bandwidth shows that our ability to process information is both limited and easily depleted.
Under moderate demand, attention sharpens.
But beyond a threshold, when workload is high, fatigue sets in, or information becomes fragmented, the brain defaults to shortcuts. It narrows its focus to what seems most relevant, and screens out the rest. This is how tunnel vision forms. Not from negligence, but from overload.
In such moments, even experienced professionals miss key cues. They stop scanning. They stop updating. And slowly, their situational picture, the one they think they’re holding, becomes less accurate than they realise.
Inattentional Blindness and the Illusion of Competence.
Christopher Chabris and Daniel Simons demonstrated this fragility in their now-famous “Invisible Gorilla” study. Participants, asked to count basketball passes, failed to see a person in a gorilla suit walk directly through the scene.
It wasn’t that they weren’t paying attention, it was that their attention was already fully occupied.
In high-reliability settings, this effect is more than academic.
Pilots scanning for approach vectors miss a slowly decaying airspeed. Control room engineers hyper-focused on a pressure valve ignore a misaligned pump status.
Traders glued to headline feeds overlook a subtle shift in derivatives pricing. Each believed they were alert. Each missed what mattered.
This is what Simons called the illusion of attention: the belief that we see more than we actually do. The more experienced someone is, the more dangerous this illusion can become, because they’re more confident in their assumptions, and often less aware of their attentional gaps.
In environments like aviation, nuclear energy, or financial operations, this gap between what is happening and what we think is happening isn’t theoretical.
It’s operational risk.
The Limits of Working Memory and the Weight of Expectation.
Human attention is further constrained by the narrow capacity of working memory, the mental scratchpad we use to hold and manipulate information in real time.
Most people can hold between five and nine pieces of information before overload begins. When complexity rises or tasks overlap, items are dropped.
Not deliberately, but unavoidably.
This limitation is amplified by expectation. Our brains use mental shortcuts, known as schemas, to decide what to pay attention to.
These schemas are shaped by experience and assumption. They help us act quickly in familiar situations, but they also bias us to ignore the unfamiliar.
In practice, this means we often see what we expect to see, not what’s actually there.
A power grid operator who has seen hundreds of routine voltage fluctuations may unconsciously downplay a subtle indicator of system instability.
A software engineer accustomed to spurious alerts may tune out a real anomaly. A medical technician, believing a sensor is faulty, might dismiss a critical alarm.
These are not reckless errors. They are predictable outcomes of perceptual bias under pressure: expectation guides attention, and attention shapes reality.
How Situational Awareness Fails Quietly.
Loss of awareness rarely looks dramatic. More often, it’s silent and cumulative: a technician skips a visual scan to make up time. A pilot assumes the co-pilot is monitoring speed. A trader overlooks a risk position during end-of-day reconciliation. Each action, on its own, feels justifiable. Collectively, they form a blind spot.
In aviation, the phrase “loss of situational awareness” appears frequently in accident reports. It almost never refers to a single moment. Rather, it signals a gradual mismatch between what the crew thought was happening and what actually was.
Endsley described this as “mode confusion”: when the operator believes the system is in one state, but it is in another. The issue isn’t that the information wasn’t there; it’s that the human system wasn’t structured to notice it.
This is why high-reliability industries focus not just on training attention, but on designing for it.
Designing to Protect Attention.
One of the most transferable lessons from aviation, nuclear power, and medicine is this: we can’t rely on humans to notice everything. But we can design systems that help them notice what matters.
Key protective strategies include:
1. Scanning protocols
In aviation, pilots are trained to continuously scan instruments in a specific sequence. This “flow” reduces the chance of fixating on one element and missing others. Energy sector control rooms use similar visual sweeps across core systems. In finance, structured monitoring of dashboards or news feeds serves the same purpose, distributing cognitive effort across key variables.
2. Interface design
Effective displays present information in a way that aligns with cognitive priorities. In aircraft, critical data is often colour-coded and located centrally. In tech operations, visualisations highlight trends rather than raw numbers. The goal is not to show everything, but to guide attention to what’s changing or deviating from baseline.
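For readers closer to software, here is one minimal sketch of that principle in Python: rather than displaying every raw value, it keeps a rolling baseline per channel and surfaces only the readings that drift beyond a set tolerance. The channel name, window size, and threshold are invented for illustration, not drawn from any real control system.

```python
import random
from collections import deque
from statistics import mean, pstdev


class DeviationHighlighter:
    """Tracks a rolling baseline per channel and flags readings that drift from it."""

    def __init__(self, window: int = 50, z_threshold: float = 3.0, min_history: int = 10):
        self.window = window
        self.z_threshold = z_threshold
        self.min_history = min_history
        self.history: dict[str, deque] = {}

    def update(self, channel: str, value: float) -> bool:
        """Record a reading; return True if it deviates notably from the channel's baseline."""
        samples = self.history.setdefault(channel, deque(maxlen=self.window))
        flagged = False
        if len(samples) >= self.min_history:
            baseline, spread = mean(samples), pstdev(samples)
            if spread > 0 and abs(value - baseline) / spread > self.z_threshold:
                flagged = True
        samples.append(value)
        return flagged


# Illustrative readings only: surface the channel that is drifting, not every number.
random.seed(0)
monitor = DeviationHighlighter()
for t in range(60):
    reading = 50.0 + random.gauss(0.0, 0.2)
    if t > 45:  # a slow drift begins late in the run
        reading += 0.5 * (t - 45)
    if monitor.update("turbine_vibration_mm_s", reading):
        print(f"t={t}: turbine_vibration_mm_s deviating from baseline ({reading:.1f})")
```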
3. Alarm management and salience
Too many alerts create noise. Too few miss events. High-reliability systems balance this through escalation thresholds, grouping of alarms, and multimodal cues (visual, auditory, tactile). The purpose is to support perception, not distract from it.
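As a rough sketch of how escalation thresholds and grouping might combine in code, the hypothetical example below folds repeated alarms of the same type into one group and raises salience only once a threshold is crossed. The alarm name, grouping window, and escalation count are assumptions for the example, not a description of any particular alarm platform.

```python
import time
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class AlarmManager:
    """Groups repeated alarms of the same type and escalates past a threshold."""

    group_window_s: float = 60.0  # occurrences within this window are treated as one group
    escalate_after: int = 3       # raise salience once a group reaches this size
    _groups: dict = field(default_factory=lambda: defaultdict(list))

    def raise_alarm(self, source: str, now: float | None = None) -> str:
        now = time.time() if now is None else now
        # Keep only recent occurrences of this alarm type, then add the new one.
        recent = [t for t in self._groups[source] if now - t <= self.group_window_s]
        recent.append(now)
        self._groups[source] = recent
        if len(recent) == 1:
            return f"NOTIFY   {source}"                    # first occurrence: low-salience cue
        if len(recent) < self.escalate_after:
            return f"GROUPED  {source} (x{len(recent)})"   # folded into the existing group
        return f"ESCALATE {source} (x{len(recent)})"       # threshold crossed: high-salience cue


# Illustrative use: repeated low-level alerts collapse into one escalating signal.
mgr = AlarmManager()
for i in range(4):
    print(mgr.raise_alarm("pump_status_mismatch", now=i * 10.0))
```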
4. Checklists and standard operating procedures
These don’t replace thinking. They protect it. By externalising routine tasks, they reduce cognitive load and free up attention for dynamic assessment. When checklists are seen as enablers rather than constraints, they become critical anchors for awareness under pressure.
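In technology operations, the same idea often takes the shape of a runbook: a routine sequence externalised so attention stays free for whatever looks unusual. The sketch below is a hypothetical, simplified version; the step names and checks are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class ChecklistStep:
    name: str
    check: Callable[[], bool]  # returns True when the step is verified


def run_checklist(steps: list[ChecklistStep]) -> bool:
    """Walk the checklist in order, stopping at the first unverified step.

    The routine verification is externalised here, so the person running it can
    keep their attention on anything unusual rather than on remembering the sequence.
    """
    for step in steps:
        if step.check():
            print(f"[ok]   {step.name}")
        else:
            print(f"[HOLD] {step.name} not verified; pausing checklist")
            return False
    return True


# Hypothetical pre-handover checklist for a control-room shift change.
checklist = [
    ChecklistStep("Alarm panel reviewed",      lambda: True),
    ChecklistStep("Open work permits logged",  lambda: True),
    ChecklistStep("Abnormal readings briefed", lambda: False),  # simulated unverified step
]
run_checklist(checklist)
```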
5. Culture and tempo
If a workplace culture rewards speed over thoughtfulness, situational awareness will degrade. Creating room for pause, through briefings, cross-checks, or just slow time, supports the recalibration of attention and restores perspective.
Cross-Industry Relevance.
These lessons are not exclusive to regulated sectors. In energy, finance, and technology, attention is the invisible boundary between routine and rupture.
In a gas turbine shutdown, dozens of signals compete for attention: vibration data, heat gradients, sequencing indicators. Missing one input can result in mechanical stress or safety risk. Systems need to be structured to reduce noise and highlight change.
In financial trading, as positions scale and information floods in, awareness can collapse into habit. Decision-makers revert to the last known model. Early cues of volatility or liquidity freeze are missed not because they weren’t visible, but because attention was already saturated.
In tech operations, incident response hinges on identifying root causes before cascading effects take hold. But when multiple teams, platforms and logs are in play, awareness becomes fragmented. Interfaces, team structure, and communication discipline all influence whether awareness remains intact.
These are not knowledge problems. They are attention problems. And they require design, not blame.
What This Means for Resilience.
Resilience is often seen as the capacity to recover. But in practice, it’s also the capacity to notice early, to detect small deviations before they become systemic failures.
That begins with attention.
In a resilient system, awareness is not an individual responsibility. It is a shared and supported function.
Tools, teams, and leadership practices all contribute.
So do rituals like briefings, post-mortems, and scenario planning, not because they are bureaucratic, but because they allow attention to reorient.
If we want to prevent failure, we need to understand how people miss what they were trained to see. Not because they were careless, but because they were at capacity: cognitively stretched, environmentally cued, and shaped by systems that didn’t fully support the human brain.
Situational awareness breaks first. But with the right design, it can also be the first line of defence.
Coming Next: What We Think Is Happening.
In Part 2, we’ll move from perception to comprehension, exploring how mental models shape not just what we see, but what we believe we’re seeing.
We’ll look at how expertise, routine and story-building support decision-making, and how they can quietly lock us into the wrong assumptions when the world changes faster than our thinking.
Because once attention slips, our next challenge is interpreting the data we did catch, and that’s where mental models either save us or lead us further off course.
