Beyond “Failed To”: What Business Leaders Can Learn from the F-15 Kingsley Field Accident Report
- mikemason100
- Sep 15
- 6 min read

In May 2023, an F-15D from the Oregon Air National Guard overran the runway at Kingsley Field after a hydraulic failure left the pilot with limited braking options. The aircraft was destroyed, the pilot survived, and, as with all such incidents, an Accident Investigation Board (AIB) was convened to uncover what happened and why.
The AIB’s role is not just to identify what went wrong, but to generate learning so that future events can be prevented. Yet when you read the Kingsley Field report closely, you see opportunities for deeper insights that weren’t fully explored. For leaders in aviation, business, or any high-performance domain, these gaps matter because they show us how easy it is to miss the real lessons when we focus on blame or rule-breaking rather than context and decision-making.
Let’s look at three areas where the investigation could have taken a different approach, and what lessons business leaders can draw from them.
1. The Problem with “Failed To”
One of the most striking aspects of the report is its repeated use of negative language: “the pilot failed to…”, “maintenance failed to…”, “ATC failed to…”.
On the surface, this may seem like a clear way to describe actions that didn’t occur. But words mean worlds. “Failed to” implies incompetence, negligence, or wilful disregard, when in reality, the decisions made often make sense at the time with the information available.
Take the line: “The pilot failed to engage the Emergency Brake/Steer System.” That phrase paints the pilot as careless. A more accurate and constructive framing would be:
“The pilot chose not to engage the Emergency Brake/Steer System, based on concerns about blown tyres and loss of control at high speed.”
The first version ends the conversation: someone failed = case closed. The second opens the door to learning: why did the pilot make that choice? Was it the right choice? If not, what can be done to support better decisions in the future?
Business Parallel: Reviews and Debriefs
Think about performance reviews or post-project debriefs. How often do we say, “the team failed to deliver on time” instead of asking, “what led the team to prioritise X over Y?” By shifting language from blame to curiosity, you create an environment where people feel safe to share the real reasons behind their decisions, and genuine learning happens.
If a product launch slips, blaming the team for “failing” to meet the deadline doesn’t help. Asking what information, constraints, or pressures shaped their choices does. Was the deadline unrealistic? Were priorities unclear? Were resources misaligned? The language you use signals whether your organisation is about blame or about learning.
Practical takeaway: Replace “failed to” with language that explores context and decision-making. Instead of “you failed to escalate the issue,” try “what made escalation difficult in that moment?” It changes the conversation from judgment to insight and, crucially, makes it about something, not someone.
2. Stopping at the Rule Break
The report identifies the pilot’s decision not to use the emergency braking system as the “cause” of the accident. Once that rule was seen to have been broken, the investigation seemed to stop digging.
But the pilot himself stated that he consciously made that choice because in previous situations, using the emergency brakes had led to blown tyres and dangerous directional control problems. In other words, his decision was based on lived experience and a rational attempt to avoid making things worse.
This is where the real opportunity for learning lies. Instead of simply declaring, “you broke the rule,” the investigation could have asked:
- What experiences had conditioned pilots to avoid the emergency brakes?
- Was the guidance around when and how to use them clear enough?
- Could simulator training help pilots practise these scenarios safely?
- Should there be established “decision gates” (for example, a specific speed or distance-to-run) where a go-around or alternate action becomes the better option?
The point isn’t that the pilot got it wrong. The point is that the system had not given him the tools, clarity, or practice to make, confidently and under pressure, what hindsight shows was the right decision.
Business Parallel: Going Beyond Compliance
In corporate life, we often stop at the surface when a process isn’t followed. “They didn’t use the system properly.” But rarely do we ask why. Maybe the system is cumbersome. Maybe prior experience has taught employees that “workarounds” are safer or faster. Maybe training didn’t prepare them for the edge cases.
For example, if a sales team consistently avoids using the prescribed tool, leaders could simply say, “they failed to follow procedure.” Or they could dig deeper: Is the tool clunky? Does it slow down their ability to engage customers? Are they punished for spending “too much” time on data entry? Only by understanding the why can leaders improve the system.
Practical takeaway: When someone breaks a rule, don’t stop at the breach. Ask: what pressures, incentives, or past experiences made that option seem like the best choice? Then redesign systems, training, and guidance to support better choices next time.
3. Ambiguous Communication and the Need for Standardisation
A contributing factor in the accident was ambiguous communication between the pilot and the tower controller. The pilot requested “cable” to stop the aircraft with his arrestor hook (like they use on aircraft carriers), but ATC interpreted this as a request to lower the cable, when in fact the pilot needed it raised. By the time the misunderstanding was caught, it was too late.
The AIB rightly noted the miscommunication, but there was no recommendation to change or standardise the terminology. In aviation history, this is exactly the kind of lesson that usually drives change. After the Tenerife disaster in 1977, use of the word “takeoff” was restricted to the actual takeoff clearance itself: no ambiguity, no wiggle room. That linguistic change has almost certainly saved countless lives since.
Imagine if this investigation had taken a similar step: defining specific, standard phrases for cable status requests, eliminating the risk of misinterpretation under stress. That’s how organisations grow safer.
Business Parallel: Eliminating Vagueness
Communication failures aren’t unique to aviation. In business, vague instructions like “ASAP” or “let’s circle back” leave room for misinterpretation. One person thinks “ASAP” means by the end of the week, another thinks it means within the hour.
High-reliability organisations eliminate ambiguity with standardised language. In software teams, a “definition of done” clarifies what “complete” actually means. In project management, RACI charts specify roles and responsibilities so that “we thought they were doing it” doesn’t derail delivery.
Practical takeaway: Audit your team’s language for ambiguity. Replace “soon,” “urgent,” or “high priority” with clearly defined terms and timelines. The more precise the language, the fewer the misunderstandings.
Lessons for Business Leaders
The Kingsley Field accident highlights three broader lessons for leaders across industries:
- Language shapes learning. Replace “failed to” with framing that explores choices and context. Language that judges shuts down learning; language that inquires opens it up.
- Rules aren’t the end of the story. Dig into why decisions were made, and use that insight to refine systems, training, and guidance. Rule breaches are often signals of systemic gaps.
- Communication must be standardised. Remove ambiguity by agreeing on precise terminology, especially when stakes are high. Shared language equals shared understanding.
Whether you’re leading a fighter squadron or a corporate team, the principle is the same: accidents and setbacks are opportunities to learn, not to blame. But that learning only happens if you’re willing to look beyond the surface, challenge assumptions, and make systemic improvements.
At On Target, we translate these lessons from combat aviation into the business world, helping leaders build teams that perform under pressure, communicate with clarity, and learn from mistakes. Because in the air and in business, the cost of missed learning is always higher than the cost of the mistake itself.
-----------------------------------------------------------------------------------------------------------------------------------------------------

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.
If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com


