
Why Didn't They Follow the Rules? Lessons From an F-35 Near Miss.

  • Writer: mikemason100
  • Mar 23
  • 7 min read
F-35 at RAAF Tindal. Image by SGT David Gibbs

In August 2025, a light aircraft and a military fast jet came ridiculously close to each other over northern Australia.


A Piper PA-28, flying as part of an air race, lost electrical power en route to Royal Australian Air Force Base Tindal. At the same time, two F-35s were returning to land. Without radio communication, without a transponder, and without being visible on radar, the PA-28 continued inbound.


Both aircraft, unaware of each other, lined up for the same runway, and separation reduced to around 72 metres laterally and just 25 feet vertically. While that might sound like a lot, I can promise you it is not. In aviation terms, this is about as close as it gets without actually hitting each other.


When reading the investigation, it is difficult not to notice a familiar pattern in the findings. The report concludes with a series of statements describing what people did not do. The engineer did not assess battery endurance. The pilot did not conduct contingency planning. The pilot did not monitor the electrical system effectively. The pilot did not divert. The controller did not detect the aircraft. This naturally leads to a question:


Why didn’t they follow the rules?


We like this question because, by asking it, we can place the 'blame' in the hands of those involved. The problem is that it doesn't really help us learn anything. It is far more useful to understand why people did what they did, not why they didn't do something else.


The illusion of explanation

Statements about what people did not do can feel like explanations. They provide a sense of closure, suggesting that if the rules had simply been followed, the outcome would have been different.


In reality, they leave some of the most important questions unanswered. Consider the maintenance decision before the flight. The report states that, after diagnosing that the alternator had failed, the engineer encouraged the pilot to continue without assessing how long the battery would last. That may be factually correct. However, it tells us very little about why that decision made sense at the time.


Did the engineer believe the battery capacity was sufficient based on past experience?

Was there an expectation that this type of flight could safely be completed on battery power?

Was there guidance available to support that assessment, or was it something that relied on judgement?

How often had similar decisions been made without issue?


Without understanding these factors, the statement becomes little more than a description of deviation rather than an explanation of behaviour.


When “not following the rules” makes sense

The same pattern appears in the findings about the pilot. The report states that the pilot did not conduct contingency planning for a loss of electrical power. That may well be true, although there is no exploration of whether the pilot believed such planning was necessary given the assurance that the battery would be sufficient for the flight.


The report also states that the pilot did not effectively monitor the aircraft’s electrical system. Again, this may be accurate, yet it does not explain what the pilot was seeing, what cues were available, or whether the system behaved in a way that made the failure obvious.


The pilot is also described as not diverting to the nearest suitable airfield and not remaining outside controlled airspace. From a procedural standpoint, that might be correct. From a human factors perspective, it raises far more interesting questions.


The pilot was using WhatsApp to communicate with people on the ground and believed a PAN had been declared. They believed air traffic control was aware of their situation. They could see other aircraft on their 'Electronic Flight Bag' (EFB, an iPad app that can share aircraft positional information) being held outside controlled airspace, and interpreted that as the airspace having been cleared for them.


In other words, they were not really ignoring the rules. They were acting based on their understanding of the situation.


The real issue: fragmented situational awareness

This incident has generally been framed as a breakdown in compliance. A far more useful way to view it is as a breakdown in shared situational awareness.


The pilot had limited awareness of the true state of their electrical system and the likelihood of total failure. Once the failure occurred, their ability to build and share situational awareness degraded rapidly. Without a functioning radio or transponder, they became effectively invisible to the systems that normally support safe operations.


Air traffic control, meanwhile, was working with incomplete and sometimes conflicting information. The aircraft was not visible on radar. It was not visible on the controllers' EFB either, because the pilot was using different software. Information was arriving via multiple phone calls, relayed through third parties, often without clarity about its accuracy or timeliness.


Even the F-35 pilots, equipped with advanced sensors, were operating without a clear picture of where the PA-28 was or what it was doing.


Everyone involved was trying to build situational awareness. No one had the full picture.


When systems don’t support the work

One of the most striking aspects of this incident is how many opportunities there were to improve awareness, yet how fragile those mechanisms proved to be under pressure.


The pilot attempted to communicate using WhatsApp. That was actually quite an adaptive response to a degraded situation, and in many ways it was creative and resourceful. However, the lack of standardisation meant that information was fragmented, delayed, and sometimes misunderstood, with far too many people trying to be helpful.


Air traffic control would normally have primary radar to see aircraft without a transponder, but on this occasion it wasn't working. The lack of compatibility between different EFB platforms further reduced visibility.


Communication pathways became complex, indirect, and difficult to manage. None of these elements on its own caused the near miss. However, each offers a lesson that, if acted on, could help avoid something similar happening in future.


This is a clear example of a system in which situational awareness broke down and could not be reliably re-established or shared.


The risk of the “more rules” response

When incidents are framed in terms of rule-breaking, the organisational response is often predictable. More rules are written. More procedures are added. More emphasis is placed on compliance.


At face value, this creates a sense of action and control. In practice, it often makes the system even harder to navigate. As complexity increases, so does the gap between how work is imagined and how it is actually done. People are required to manage more information, more constraints, and more competing priorities, which makes it increasingly difficult to follow every rule in every situation.


So ironically, more rules can create the very conditions that lead to further deviations.


Where the real learning is

This report contains the raw material for meaningful learning, although much of it sits between the lines rather than in the findings themselves.


There are important questions about how maintenance decisions are made under operational pressure, particularly when guidance is incomplete or relies on judgement.


There are valuable lessons about how pilots interpret and act on incomplete information, especially when they believe that others share their understanding of the situation.


There are significant insights into how air traffic control builds situational awareness when traditional surveillance and communication systems are degraded.


There are also clear opportunities to improve how alternative communication methods, such as mobile devices and electronic flight bags, are integrated and standardised to provide redundancy when primary systems fail.


These are the areas where future risk can be reduced. Not by asking why people failed to follow the rules, but by asking how the system made those actions make sense.


What this means for business leaders

It is easy to look at an aviation incident and see something distant from the corporate world.

In reality, the same patterns exist in almost every organisation. Leaders often ask why people did not follow the process, the policy, or the plan. Investigations frequently conclude with statements about what individuals failed to do.


These explanations feel satisfying. But they rarely improve performance. In complex organisations, people are constantly operating with incomplete information. They are balancing competing demands, interpreting signals, and making decisions based on their understanding of the situation at the time.


When outcomes are poor, it is rarely because people chose to ignore the rules. More often, the system did not provide them with the awareness, clarity, or support needed to apply those rules effectively.


The most effective organisations recognise this and focus on strengthening how information flows, how understanding is shared, and how systems support decision-making under pressure.


Final thought

The near miss at Tindal was not simply the result of people failing to follow procedures.

It was the result of a system that struggled to maintain shared situational awareness when conditions became degraded.


By the time the aircraft were on final approach, the outcome had largely been shaped by the gaps in that awareness.


If we want to prevent similar events, the question is not:

Why didn’t they follow the rules?


The better question is:

How do we build systems that help people understand what is really going on, especially when things start to go wrong?


When situational awareness is strong, many problems are resolved before they become incidents.


And when it is weak, even simple situations can very quickly escalate towards disaster.

-----------------------------------------------------------------------------------------------------------------------------------------------------

On Target Co-Founders. Mike Mason and Sam Gladman

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.


If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com.

 
 
 
