Five Seconds to Disaster

  • Writer: mikemason100
  • Mar 9
  • 5 min read
The F-35A on the ramp. Image courtesy of the USAF

This post is about what happens when complexity, distraction and assumptions collide.

On the night of 19 May 2020, an F-35A Lightning II was returning to Eglin Air Force Base in Florida after a routine training sortie. The landing should have been uneventful. Instead, within five seconds of touching down, the aircraft bounced, became unstable, and the pilot ejected as the F-35A departed the runway and burst into flames. The aircraft, worth roughly USD 176 million, was completely destroyed.


The investigation found that the aircraft landed around 50 knots faster than the 'correct' landing speed, touched down at an unusually shallow angle, and entered a violent oscillation that the pilot could not recover from.


As with almost every accident in aviation, there is more to the story than excess speed. As with most of our blogs, we'll look at how complexity, human performance, training assumptions, and systems that behaved differently than people expected all played a part.


And the lessons apply far beyond aviation.


Problem 1: The Wrong Speed

The immediate cause of the accident was straightforward: the aircraft landed at 202 knots, around 50 knots faster than appropriate for the aircraft’s weight.


Landing too fast meant the jet touched down at a very shallow angle of attack, which led to a bounce. Normally, a bounce is recoverable. However, this one triggered a chain reaction. As the aircraft bounced, the pilot attempted to regain the correct landing attitude. The aircraft’s flight control system began reacting to the oscillations and pilot inputs in ways that quickly became mismatched and opposed. Within seconds, the pilot felt the aircraft was no longer responding to his commands.


The pilot attempted a go-around with full afterburner, but the aircraft continued oscillating. After several worsening bounces, and with only a few seconds to react, he ejected.


From the outside, it’s easy to reduce this to a simple conclusion: the aircraft landed too fast.

My concern is that this explanation hides the far more interesting, and more useful, lessons.


The Distraction That Happened at the Worst Possible Moment

Shortly before landing, the pilot noticed that the Helmet Mounted Display (HMD) appeared misaligned with the horizon. This system projects flight information directly onto the pilot’s visor. The misalignment was subtle but significant: it meant the pilot had to mentally reconcile conflicting information from different instruments during a critical phase of flight. Let's face it, landings are important.


To make matters worse, the display produced a distracting “green glow” effect as the aircraft descended toward the runway, forcing the pilot to squint through the imagery to see outside visual cues. The result was that the pilot’s attention was pulled away from monitoring one of the most important parameters in aviation: airspeed.


In business terms, this could be the equivalent of a team focusing on a secondary problem while a critical performance metric quietly drifts out of tolerance.


The Trap of Automation

The aircraft was using an automated system called Speed Hold, part of the auto-throttle functionality. Landing with Speed Hold engaged is prohibited, but it was not disengaged during the approach.


Because of this, the pilot’s normal cross-check routine never occurred. The system was holding speed (which is what it was designed to do), but the speed it was holding was too high for landing.


Automation is designed to reduce workload. But it can also inadvertently create complacency, where operators trust the system and stop monitoring the basics. This is not unique to aviation; in fact, it is quite normal behaviour.


In business, we see the same pattern when organisations rely on dashboards, algorithms or automated processes without maintaining the habit of actively checking the fundamentals.

Automation works best when humans remain actively engaged, not when they assume the system has everything under control.

When Training Teaches the Wrong Lesson

Perhaps the most interesting finding from the report was related to simulator training.

Investigators discovered that the F-35 simulator allowed pilots to land the aircraft successfully even at the same high speed seen in the accident. In fact, investigators themselves were able to land the simulator under the same conditions without losing control.


The real aircraft, however, behaved very differently. The pilot had unknowingly developed negative learning: his training suggested the situation was recoverable because he had previously recovered it in the simulator.


In organisations, this happens more often than we like to admit. Training exercises, rehearsals, and simulations are supposed to prepare people for reality. But if they don’t accurately replicate real conditions, they can build potentially dangerous assumptions.

People walk away believing they know how a situation will unfold. Until it doesn’t...


The Hidden Role of Fatigue

Another contributing factor was cognitive degradation at the end of the sortie.

The pilot reported that flying the F-35 routinely created a level of mental fatigue due to physiological factors such as breathing through the aircraft’s oxygen system and the cognitive demands of managing the aircraft’s systems.


On this particular night, the pilot rated his fatigue level higher than usual. Fatigue does more than just make people tired. It narrows attention, reduces situational awareness and slows decision making.


In complex environments, those effects really matter. When fatigue meets distraction and system complexity, the margin for error, and the number of risk controls that provide redundancy, shrinks quickly.


The Five-Second Reality of Complex Systems

One of the most striking aspects of this accident is how quickly it unfolded. From touchdown to ejection was roughly five seconds. In that small window, the pilot had to diagnose an unfamiliar and wholly unexpected aircraft response, determine whether the aircraft was recoverable, attempt corrective actions, and decide whether to eject.


Five seconds is very little time to process what is happening. That is the nature of complex systems: when things finally go wrong, they often do so very quickly.


Which is why prevention matters far more than reaction.


The Business Lessons

Although this was an aviation accident, the underlying lessons apply directly to organisations.

1. Complexity multiplies failure pathways

Advanced technology increases capability. It also increases the number of ways things can go wrong.

2. Distraction hides the real problem

People rarely miss the obvious because they are careless. They miss it because their attention is pulled somewhere else.

3. Automation requires supervision

Systems should assist humans, not replace their judgment.

4. Training must reflect reality

If training environments do not accurately mirror the real world, they can create dangerous confidence.

5. Fatigue is a performance risk

Human performance degrades under fatigue, especially in high-complexity environments.


The Useful Lesson

The destruction of a USD 176 million aircraft wasn’t the result of one catastrophic mistake.

It was the result of multiple small factors aligning:

  • a slightly misaligned display

  • a system left engaged

  • a fast approach

  • fatigue

  • misleading simulator experience

  • complex flight control logic

Individually, none of these would necessarily cause a crash. Together, they created a situation that unfolded faster than any human could fully diagnose.


So the real lesson for organisations is that catastrophic outcomes rarely begin with catastrophic decisions. They begin with small deviations that quietly compound until the system runs out of margin.

-----------------------------------------------------------------------------------------------------------------------------------------------------

On Target co-founders Mike Mason and Sam Gladman

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.


If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com.
