Lines in the Sand, Safety Nets, and a Just Culture: Lessons from an Aviation Near-Miss
- Mike Mason
- Nov 24
- 5 min read

On 24 July 2025, a Cessna 206 departed Archerfield Airport in Queensland, Australia, just minutes before last light. It was a routine training flight planned under instrument flight rules (IFR). The instructor and student had submitted an IFR flight plan with a 17:30 departure time, comfortably before last light at 17:39.
As aviation often reminds us, routine creates the perfect space for small shifts to grow into real hazards. The aircraft took off seven minutes late. And because the instructor elected to depart visually (VFR) rather than conducting the IFR departure they had planned, last light arrived while the aircraft was still below the required 2,900 ft minimum altitude. That immediately triggered a Minimum Safe Altitude Warning (MSAW) at Brisbane Centre.
Air traffic control intervened, instructed the pilots to expedite their climb, and the flight continued normally. No one was harmed. No aircraft was damaged. But the event contains some powerful lessons not just for pilots, but for any organisation that operates in a complex, time-sensitive environment.
This incident wasn’t about poor flying, nor was it a story of incompetence. It was a story about boundaries, systems, and culture, and how each plays a critical role in preventing small deviations from becoming serious events.
Lesson 1: Clear Lines in the Sand Prevent Normalisation of Deviance
The instructor almost certainly retained visual contact with the ground after take-off. The chances of controlled flight into terrain were extremely low. But low isn’t zero, and aviation is built on a long cultural memory of events where “low chance” became “last chance.”
That’s why aviation draws very clear lines in the sand, including the transition from day to night.
Under the rules, once last light has passed, departures must be conducted under IFR, not VFR, unless very specific conditions are met. This isn’t bureaucracy. It’s a safety boundary.
The danger here is subtle but universal: When there is no hard boundary, humans naturally drift.
A seven-minute delay doesn’t “feel” like a safety risk. A departure two minutes before last light doesn’t “feel” meaningfully different from one nine minutes earlier. Human intuition is smooth and continuous. Safety limits are not.
This is exactly how normalisation of deviance begins:
“It’s only a few minutes later, it’ll be fine.”
“We can still see the ground.”
“We’ve done this before and nothing went wrong.”
And each time nothing bad happens, the deviation silently becomes the new normal.
Clear rules, such as “VFR departures must not occur after last light”, protect people from the slow slide into unsafe territory. They prevent well-intentioned professionals from talking themselves into “just this once.” And more importantly, they provide a shared, objective boundary that removes ambiguity.
Business parallel:
When organisations lack clear boundaries, whether in deadlines, responsibilities, quality standards, approvals, budgets, or safety constraints, people begin to make micro-exceptions. Not malicious ones. Just human ones. Each exception becomes the justification for the next. Great leaders draw clear lines not to limit people, but to protect them. The blog post ‘When “normal” becomes dangerous’ explores this in more detail.
Lesson 2: Automated Safety Systems Protect Us From Being Human
When last light passed while the Cessna was still below 2,900 ft, Brisbane Centre’s ground-based surveillance systems did exactly what they were designed to do: they caught what the humans couldn’t.
The Minimum Safe Altitude Warning (MSAW) was automatically triggered. Air traffic control saw the risk. The controller’s attention was focused, and they intervened instantly. The aircraft climbed. The hazard disappeared.
This is how modern safety systems should work. We sometimes talk about automation as though it replaces the human, but in safety-critical environments, its power is far more humble and far more important: automation compensates for human limitations.
Humans are excellent at judgement, nuance, creativity, and decision-making. We are terrible at continuous, repetitive monitoring. Computers are the opposite.
The instructor wasn’t “careless”; they were operating a busy training flight, managing a student, managing the aircraft, managing communication, and managing twilight departure conditions. Their job was complex. The computer’s job was simple: If altitude < MSA → alert.
These seemingly simple components are a vital part of any complicated or complex system.
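To make that concrete, here is a minimal sketch in Python of what an “altitude below MSA → alert” rule can look like. Everything in it (the AircraftState type, the callsign, the exact condition) is illustrative and assumed for the example; the real MSAW logic at Brisbane Centre is far more sophisticated. The principle, though, is the same: a simple comparison, evaluated tirelessly on every surveillance update.

```python
from dataclasses import dataclass
from typing import Optional

MSA_FT = 2_900  # minimum safe altitude for the area, in feet (from the incident)

@dataclass
class AircraftState:
    """One surveillance snapshot of an aircraft (illustrative fields only)."""
    callsign: str
    altitude_ft: int
    after_last_light: bool

def msaw_alert(state: AircraftState) -> Optional[str]:
    """Return an alert string if the aircraft is below the MSA after last light."""
    if state.after_last_light and state.altitude_ft < MSA_FT:
        return f"MSAW: {state.callsign} below {MSA_FT} ft after last light"
    return None

# The check runs on every surveillance update -- exactly the kind of relentless,
# repetitive monitoring humans are bad at and computers never tire of.
state = AircraftState(callsign="VH-ABC", altitude_ft=2_350, after_last_light=True)
alert = msaw_alert(state)
if alert:
    print(alert)  # MSAW: VH-ABC below 2900 ft after last light
```

Note the division of labour: the code doesn’t fly the aircraft or make the judgement call. It simply hands the controller a timely, unambiguous prompt, and the human does the rest.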
Business parallel:
Organisations often rely on humans to monitor things that humans are not good at monitoring: deadlines, compliance tasks, procedural steps, financial thresholds, expiry dates, risks that “shouldn’t happen,” and so on. Automated systems (software, alarms, workflows, dashboards) aren’t a luxury; they’re a safety net. Smart leaders design automation that doesn’t make the decision for the human; it makes it easier for the human to make the right decision at the right time.
Lesson 3: A Just Culture Reveals Hazards Before They Become Accidents
One of the most striking parts of the ATSB’s investigation was what it uncovered behind the event. Through interviews, collaboration, and system-level review, investigators learned something previously unrecognised:
There is a potential conflict between IFR departures from Archerfield and IFR arrivals into Brisbane runway 01R — both using 3,000 ft over the same area.
This wasn’t discovered because something went catastrophically wrong.
It was discovered because aviation has a culture where:
people are encouraged to speak openly
investigators ask system-level questions
data is shared transparently
there is no fear of sanctions
learning is prioritised over blame
stakeholders collaborate rather than defend themselves
This is part of what organisational theorists call a just culture. Read more about it in this blog.
Hazards are surfaced voluntarily because people feel safe to talk about them. Mistakes are shared, not hidden. Weaknesses are acknowledged, not denied. As a result, the entire system gets safer, not just the individuals involved.
Business parallel:
Most organisations never discover “system-level conflicts” until something explodes. This is because employees learn very quickly that raising concerns leads to extra work, blame, pushback, or political risk. In just cultures, people tell the truth early, leaders say “thank you for the heads-up”, systemic problems are addressed proactively, learning is continuous, and vulnerability is rewarded rather than punished. If your team cannot openly discuss risks, then your system is accumulating them.
Final Thought: Safety Isn’t About Flying Perfectly. It’s About Designing Systems That Don’t Require Perfection
This incident ended safely because:
rules created clarity
automation caught the drift
culture uncovered deeper risks
That’s the model for every high-performing organisation.
You don’t build safety by assuming people will always get everything right. You build safety by designing systems that:
catch small mistakes early
prevent drift before it becomes danger
make safe behaviour easy
reward openness and learning
reveal hidden conflicts before they become major failures
Aviation didn’t become safe because pilots are perfect. It became safe because the system knows they aren’t and supports them accordingly.
And that’s a lesson every business needs to hear.
-----------------------------------------------------------------------------------------------------------------------------------------------------

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.
If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com


