
What Does Normal Look Like? The Real Lessons From an F-16 Crash.

• Writer: mikemason100
  • 4 days ago
  • 6 min read
F-16C. Image courtesy of the US Air Force

During a routine training sortie, at the end of a low-level practice intercept of a light aircraft, an F-16 entered what the pilot believed was an unrecoverable state and crashed shortly afterwards. The pilot ejected safely.


What began as a manageable situation escalated rapidly. The pilot was faced with conflicting cues, time pressure, and narrowing margins. The investigation that followed did what investigations often do: it identified contributory factors, categorised behaviours, and reconstructed what could have been done differently.


Four key contributory factors were highlighted: a failure to identify unsafe practices, widespread routine violations, a wrong choice of action, and simulation analysis showing the aircraft was still technically recoverable.


All of those statements may be factually accurate, but they leave a deeper question unexplored. What did normal look like before the crash? Unless we explore and understand the answer, we are left with conclusions that sound definitive but teach very little.


So let's dig into each of these factors and see what the real lessons are.


Failed to Identify or Correct Risky or Unsafe Practices

The report states that individuals failed to identify or correct risky or unsafe practices. On the surface, that feels clear. If something was unsafe, it should have been stopped. But unsafe according to whom, and when?


Risky practices rarely appear overnight as obvious threats. They evolve gradually. They often begin as minor adaptations to frictions in the system – shortcuts that save time, workarounds that smooth inefficiencies, behaviours that help teams achieve demanding targets. When those adaptations succeed repeatedly without negative consequence, they begin to feel not just acceptable, but sensible. Over time, they stop being seen as risky at all.


This is how a culture forms: not through reckless intent, but through repetition and reinforcement.


If a practice truly was widespread enough to influence the accident sequence, then it was almost certainly visible to more than one individual. That means it was tolerated, perhaps even quietly rewarded. Labelling it “unsafe” after the fact does not explain why it felt safe beforehand.


If we want learning, the more useful questions are uncomfortable ones:


How long had this practice existed?

How often had it worked successfully?

What pressures made it attractive?

Who benefited from it functioning smoothly?


Without exploring those answers, “failed to identify risk” becomes a statement of outcome, not insight.


Commits Widespread/Routine Violation

This phrase carries weight. A violation suggests deviation from established standards. But pairing it with “widespread” and “routine” creates tension. If something is routine, it is normal. If something is widespread, it is cultural.


When a behaviour is both, it cannot logically be framed as isolated misconduct. It reflects a gap between written procedure and lived reality.


In many high-performance environments, formal processes describe ideal work. However, actual work, done under time pressure, resource constraints, and competing priorities, inevitably differs. Teams adapt in order to deliver results. Those adaptations, when successful, become the new baseline.


From there, two things happen. First, the deviation becomes invisible because everyone is doing it. Second, it becomes embedded in expectations of performance. Challenging it would slow the system down.


Only when an adverse outcome occurs does the language shift. What was once “how we get things done” becomes a violation. The danger here is obvious: if we punish individuals for behaviours the system quietly depended upon, we create fear rather than learning. And fear suppresses the very information we need to prevent recurrence.


Understanding normal behaviour requires curiosity about operational reality and not just compliance metrics.


Wrong Choice of Action During an Operation

Describing a decision as the wrong choice is tidy. It implies that a better option was clearly available and simply not selected. However, decisions are not made with hindsight. They are typically made with incomplete information, often under time pressure, and heavily influenced by expectation and training. At the moment of choice, the selected action almost always appears reasonable (and that is why we make it).


To call something the wrong decision without examining why it made sense at the time is to skip the most valuable part of the analysis. What cues was the pilot seeing? What did their experience suggest was most likely? How much time was available to interpret the situation? What alternatives seemed viable in that moment?


Humans do not deliberately choose unsafe outcomes (with the odd exception that is outside the scope of this blog). They generally choose what appears to be the best course of action at the time given their mental model of the situation.


If that mental model was flawed, we must ask why. Was training insufficient? Were cues ambiguous? Did previous experience reinforce a different expectation?


When we label decisions as wrong without exploring the context that made them appear right, we reduce complex cognition to simplistic judgment. That may feel decisive, and it's easy. But it does little to improve future performance.


Human Factors Analysis of Simulation Data

The final contributory element relies heavily on simulation analysis. The data suggests that the aircraft remained technically recoverable for a period of time. In other words, had the correct control inputs been made at the right moment, the outcome might have been different.


Technically, that may be true. But there is a profound difference between theoretical recoverability and realistic recognisability. Physics tells us what is possible. Human factors tells us what is probable.


An aircraft may remain within aerodynamic limits for recovery, but if the cues that signal impending loss of control are subtle, conflicting, or masked by workload, the window for recognition can be vanishingly small.


Saying “the aircraft was recoverable” risks implying that survival was within easy reach. It shifts focus toward execution rather than perception. Yet in many accidents, the critical variable is not whether the recovery inputs are known; it is whether the need to recover, or the ability to recover, is recognised early enough.


Simulation can model control authority. It cannot fully model expectation, stress, ambiguity, or cognitive saturation.


The meaningful question is not whether recovery was mathematically possible. It is whether, in that moment, with those cues and that workload, recognition was realistically achievable.

Without addressing that, the analysis becomes a technical footnote rather than a learning lever.
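
To make that distinction concrete, here is a minimal, purely illustrative sketch in Python. It is not drawn from the investigation's simulation, and every number in it is invented: the aerodynamic recovery window is held constant, while the time needed to recognise the situation is sampled from a distribution that stretches with workload.

```python
import random

# Toy Monte Carlo model, illustrative numbers only.
# The airframe stays technically recoverable for a fixed window,
# but whether the pilot recognises the situation in time is probabilistic.

RECOVERY_WINDOW_S = 6.0  # seconds the aircraft remains aerodynamically recoverable
EXECUTION_TIME_S = 1.5   # seconds to apply the correct inputs once recognised

def recognition_time(workload: float) -> float:
    """Sample a recognition delay; higher workload stretches it."""
    base = random.lognormvariate(0.5, 0.6)  # median ~1.6 s in calm conditions
    return base * workload

def recovery_probability(workload: float, trials: int = 100_000) -> float:
    """Fraction of trials where recognition plus execution fits the window."""
    successes = sum(
        recognition_time(workload) + EXECUTION_TIME_S < RECOVERY_WINDOW_S
        for _ in range(trials)
    )
    return successes / trials

for workload in (1.0, 2.0, 3.0):
    print(f"workload x{workload:.0f}: recovered in {recovery_probability(workload):.0%} of trials")
```

Run it and the physics never changes: the window is six seconds in every trial. Yet the chance of acting inside that window collapses as workload climbs. That is the gap between possible and probable.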


So What Does Normal Look Like?

Across all four contributory factors, a pattern emerges. The language of the report describes deviation and error. But it does not deeply explore the system conditions that shaped behaviour.


Accidents are rarely born from outrageous acts. They tend to emerge when ordinary behaviour intersects with unusual timing. Something that works most of the time fails at exactly the wrong moment. Margins disappear quickly.


If we want to prevent recurrence, we must understand what daily operations looked like before the crash. What trade-offs were routine? What shortcuts were common? What assumptions were shared? What signals were routinely ignored because they had never mattered before?


That is where prevention lives: in proactively finding out what else is normal...


The Business Parallel

Corporate investigations often sound remarkably similar.


A team failed to escalate.

A manager made the wrong strategic call.

Compliance processes were bypassed.

Warning signs were missed.


In each case, the post-event narrative identifies what should have happened. Rarely does it explore why the alternative felt unnecessary or unattractive at the time. Routine deviations are usually adaptive responses to system pressure. Wrong decisions are usually logical within the decision-maker’s frame of reference and are only identified as wrong with hindsight. Missed signals are often weak signals that have never before carried consequence.


If leaders want fewer expensive surprises, they must spend time understanding normal work, not just enforcing ideal work.


That means asking:


What are our teams doing to cope with friction?

What workarounds have become embedded?

Where are we relying on individual vigilance rather than system clarity?

Are we analysing performance through the lens of outcome, or understanding through the lens of context?


Final Reflection

It is easy to conclude that this crash resulted from failure, violation, poor choice, and missed opportunity for recovery. It is harder, and far more valuable, to examine what the entire environment made likely.


Before something goes wrong, behaviour feels normal. If we do not understand what normal looks like, we will continue to describe accidents accurately while learning very little from them.


The real work is not asking who failed. It is asking what made sense, and how to make normal safer tomorrow than it was yesterday.

-----------------------------------------------------------------------------------------------------------------------------------------------------

On Target Co-Founders. Mike Mason and Sam Gladman

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.


If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com.