When Investigations Don’t Lead to Change: Lessons for Business from an Aviation Accident
- mikemason100
- Sep 22

Accident investigation reports are meant to do more than describe what happened. Their purpose is (or at least should be) to uncover why an event unfolded the way it did and, most importantly, what can be changed to prevent it from happening again.
That’s why this particular report, into a midair collision at the 2023 Reno Air Races, felt unsatisfying. While interesting in its detail, it ultimately produced no changes. Without actionable recommendations, an investigation risks becoming nothing more than a historical account: a document that informs but does not improve.
This is a lesson that resonates far beyond aviation. In business, post-mortems, project reviews, and after-action reports often fall into the same trap. They describe the event and note the outcome, but fail to extract the learning needed to do things better next time. If we’re not using these opportunities to adapt systems, behaviours, or processes, then the effort is wasted.
What struck me about this investigation was how much was left unexplored. Because the aircraft involved were older, there was little recorded data, so the report relied heavily on witness testimony. The conclusion that the two pilots “did not see each other” is fairly unsurprising; in fact, it’s almost inevitable. But is that enough? Surely the value lies not in stating the obvious, but in asking: what made it more likely that the pilots didn’t, or wouldn’t, see or anticipate each other?
This is where richer insights need to be drawn, and where parallels to business are clear. Let’s look at three areas that could have provided stronger takeaways, both for aviation and for organisations everywhere.
1. Non-standard Patterns: When Variability Becomes the Norm
The report notes that the pilots flew non-standard traffic patterns. But how “non-standard” was this behaviour, really? Was it a rare deviation or a common practice? If the latter, then what’s considered “non-standard” may in fact be normalised deviation. Over time, people adopt shortcuts, workarounds, or informal methods that become accepted practice — until something goes wrong.
In aviation, if pilots frequently flew patterns outside the published procedures, it’s worth asking: why? Was it easier, faster, or did it help them manage local conditions? More importantly, was it tacitly tolerated because it usually caused no problems?
In business, we see this every day. Teams deviate from standard processes because it’s quicker, because the tools don’t quite fit reality, or because “that’s how we’ve always done it.” These deviations may appear harmless, but they represent cracks in the system. When the environment changes, whether through a high-pressure deadline, a staffing shortage, or a surge in workload, those cracks can become failures.
The lesson: don’t just highlight non-standard behaviour; explore why it happens and whether the system itself encourages or tolerates it. Non-standard behaviour may even turn out to be innovation worth embracing, but the conversation has to start before anyone can confirm that. None of us is as smart as all of us.
2. System Limits: How Much Complexity Can We Handle?
At the time of the accident, there were numerous aircraft airborne in a small area, with separation left largely to the pilots themselves. This raises a critical systems question: how many variables can a human-based system handle before it reaches breaking point?
Relying on pilots to “see and avoid” might work with two or three aircraft in the circuit. But add more traffic, different skill levels, environmental conditions, and competing pressures, and the system’s margin quickly erodes. The investigation notes the congestion but doesn’t go further to ask: where is the limit? And how do we prevent that limit from being exceeded?
For business leaders, this is a familiar challenge. Every organisation has a threshold for complexity. A small team might manage with ad-hoc communication and flexible roles, but scale that up and the same approach collapses. Without defined systems, oversight, and load-balancing, the organisation is effectively leaving “separation” up to individuals — a risky strategy.
The lesson: instead of assuming the system can cope, actively explore its limits. Ask where it breaks, how it signals overload, and what safeguards are needed before reaching that point. A proactive “Red Teaming” approach can help here; read our other blog post on Red Teaming to find out more.
3. Communication Processes: Ambiguity Is the Enemy
The report acknowledges that a lack of communication from air traffic control (ATC) was a contributing factor. But it stops there, without offering guidance on how communication should be improved. Were the processes vague? Were expectations unclear? Was it cultural? Was there a reluctance to speak up, or a belief that pilots would self-manage?
This is another missed opportunity. In aviation, history has shown us that communication needs structure, clarity, and standardisation. After the 1977 Tenerife air disaster, for example, the word “takeoff” was reserved exclusively for positive clearance to prevent ambiguity. Clear, universally understood language is essential in high-stakes environments.
Business is no different. Vague communication (“ASAP,” “make it better,” “handle this quickly”) creates space for misinterpretation. Add multiple teams, time zones, or pressures, and those ambiguities multiply. The result is missed expectations, duplicated effort, and in some cases outright failure.
The lesson: communication processes must be specific, structured, and agreed upon. Clarity isn’t optional; it’s a safety net.
Turning Observations Into Action
The most notable aspect of this report is not what it found, but what it didn’t pursue. By stopping at obvious conclusions and not recommending changes, it offered little in terms of prevention or improvement.
For businesses, the takeaway is clear: after-action reviews, audits, and investigations must go beyond description. They should ask:
- What conditions made this outcome more likely?
- Where did the system’s limits show?
- How can communication be made clearer and more resilient?
The goal is not just to document events but to use them as leverage for learning and improvement.
Final Thoughts
Accident reports, like business reviews, carry enormous potential. They can be catalysts for change, helping organisations see hidden risks, refine processes, and prevent recurrence. But only if they look deeper than the surface, beyond the obvious, and ask the harder questions.
This report may have documented a tragedy, but it didn’t fulfil its greatest purpose: to prevent another one. In our businesses, let’s not make the same mistake.
At On Target, this is what we specialise in: helping teams go beyond “what happened” to uncover why it happened, and how to build systems, communication, and culture that prevent recurrence. Whether in the cockpit or the boardroom, the objective is the same: not just to survive, but to continually improve.
-----------------------------------------------------------------------------------------------------------------------------------------------------

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.
If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com


