When the System Sets You Up to Fail: Situation Awareness Lessons for Business from Wagga Wagga’s Runway Near Miss
- mikemason100
- Nov 19
- 5 min read

On 15 July 2024 at Wagga Wagga Airport (Australia), a QantasLink Dash 8 and a Piper PA-28 came dangerously close to occupying the same runway at the same time. No one behaved recklessly. Both aircraft made all the required calls. Both crews believed they had a good picture of what was happening. Both operated exactly as their procedures expected.
And yet, they still converged.
The ATSB’s report explains the timeline, discusses situational awareness, and ends with the familiar refrain used in many non-towered aerodrome investigations:
“Pilots must maintain vigilance and situational awareness at all times.”
At face value, this is sound advice. But you cannot train, brief, or remind people into perfection when the system they’re operating in constantly strips away their awareness.
Wagga Wagga is a known pressure point in Australia’s aviation network. High traffic volume, intensive training operations, mixed aircraft performance, radio blackspots, and the absence of an air traffic control tower (common at Australian regional airports, for non-Australian readers) combine to create an environment where even experienced crews routinely struggle to build an accurate picture.
There are real lessons here not just for aviation, but for every business. This event wasn’t caused by a lack of awareness. It was caused by a system that makes awareness hard to achieve.
Lesson 1: Situational Awareness Is More Than a Skill. It’s a Resource
The ATSB focuses heavily on situational awareness (SA). We often talk about SA but rarely consider what it actually is. In this context, SA is more than something people simply have or don’t have. It’s a resource produced by the environment, and at Wagga, that resource can be depleted quickly:
- Radio blackspots mean crews cannot reliably receive calls.
- The Dash 8 is required to use COM 2 on the ground, which is known to be less effective.
- Multiple aircraft broadcasting at once cause stepped-on transmissions.
- Aircraft using opposite ends of the runway are 1.7 km apart, making visual detection extremely difficult.
- A training aircraft may be slow to make decisions; an RPT crew is fast and structured.
- A Saab inbound at the same time adds congestion and competing priorities.
This is not a “maintain vigilance” environment. It’s an “awareness attrition” environment.
The Dash 8 crew didn’t hear the PA-28’s taxi call. The PA-28 didn’t communicate directly with the Dash 8. The Dash 8 couldn’t visually detect the PA-28 early enough to prevent the conflict.
Every one of these is predictable in a system like this.
Business parallel:
Telling employees to “pay more attention” is the weakest form of risk control, and it simply doesn’t work. If the environment fragments attention with constant notifications, shifting priorities, poor communication channels, and overlapping responsibilities, then awareness collapses. Good leaders don’t demand vigilance. They design environments where vigilance is easier to achieve.
Lesson 2: Rules Are Great. Unless They're Too Hard to Follow.
The ATSB provides sensible advice:
- Make the right calls.
- Monitor the local area frequency.
- Don’t assume silence means no traffic.
- Maintain lookout.
- Organise separation.
- Turn on the transponder earlier.
All good advice. But advice is not a safety system. The reality is that Wagga’s environment makes many of these rules difficult to follow:
- The Dash 8’s mandated use of COM 2 reduces reception strength.
- The eastern end of the airport has known VHF dead zones.
- The PA-28’s transponder was off, which was compliant with SOPs but reduced detectability.
- The Saab’s repeated calls were stepped on by earlier broadcasts.
- Direct pilot-to-pilot coordination is recommended, but difficult when transmissions aren’t received.
- Visual lookout was impossible over 1.7 km, especially with the PA-28’s small profile and angle.
If a rule is technically correct but practically unrealistic, it becomes theatre, not safety.
Business parallel:
Organisations love rules. Policies, procedures, handbooks, compliance frameworks are all designed with good intent. But a rule that is impossible to follow under real workload conditions is not a risk control. It’s a trap, waiting for someone to fall into it. If your systems make compliance difficult, the fault is not with the people. It’s with the system.
Lesson 3: People Don’t Fail in Isolation. Systems Create Their Decisions
Both crews acted professionally. Both made the correct broadcasts for their phase of flight. Both used the information available to them. Both adapted quickly once the conflict became known. The Dash 8 captain reacted immediately and reversed off the runway. The PA-28 instructor rejected the take-off. These are positive examples of decision-making under pressure. The deeper question is this:
How did two aircraft crews develop completely different mental models of the runway environment?
Because the system is built on assumptions that no longer hold up:
- “Everyone will hear radio calls.” → Untrue at Wagga.
- “Lookout is sufficient.” → Not at 1.7 km with small targets.
- “Self-separation works.” → Only when information is complete.
- “Procedures ensure safety.” → Not if radios, geometry, and visibility undermine them.
Business parallel:
When projects fail, organisations often blame individuals: “They should have communicated more.” “They should have known better.” “They should have followed the process.”
But individuals act within the conditions the system provides. If those conditions are inconsistent, noisy, overloaded, or under-supported, people will inevitably diverge in their understanding. That’s not human error. That’s a system error expressed through humans.
Lesson 4: Real Learning Requires Asking the Hard Questions
The ATSB report includes recommended reminders, advisories, NOTAMs, and awareness messaging. All fine. But the most powerful learning opportunity remains untouched:
Is Wagga’s system even capable of supporting safe, self-separating operations under peak load?
If the airport has known radio dead zones… If operators are aware COM 2 is weaker… If transponder SOPs reduce detectability… If traffic mixes with wide performance separation… If visibility across the runway is inherently limited… then the question is not “How do we remind pilots to be more aware?”
The question is:
How do we redesign the system so situations like this cannot happen?
That’s where real safety lives. And that’s where real business improvement lives too.
Business parallel:
If your organisation keeps issuing memos about communication, process adherence, or situational awareness… it means the system is failing, not the people. Real change happens when leaders redesign the environment so good performance becomes the default, not the heroic exception.
Final Thought: Systems Create Outcomes, Not People
This incident wasn’t a story of negligence or incompetence. It was a story of two highly capable crews doing their best inside a system that quietly sets them up to fail. A runway incursion is never just about radio calls. It’s about systemic blind spots, literal and organisational. If aviation teaches us anything, it’s this:
**You don’t get safer by telling people to try harder.**
**You get safer by designing systems where trying harder isn’t required.**
---

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.
If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com


