When Routine Turns Risky: The Hidden Cost of Complacency
- mikemason100
- Nov 3
- 5 min read

In the early hours of June 1, 2009, Air France Flight 447, an Airbus A330 flying from Rio de Janeiro to Paris, vanished over the Atlantic Ocean. The aircraft crashed after entering an aerodynamic stall from which it never recovered, killing all 228 people on board.
What intrigued investigators wasn’t just that a modern, sophisticated airliner could fall out of the sky; it was how it fell out of the sky. There was no explosion, no catastrophic mechanical failure, no storm that should have been fatal. Instead, a chain of small, individually manageable, routine events aligned to end in tragedy.
The autopilot disconnected when the aircraft’s airspeed sensors (pitot tubes) temporarily froze, a known and relatively minor issue. The pilots suddenly found themselves hand-flying an aircraft at high altitude, something they rarely practised. Within seconds, confusion set in. Instruments disagreed. The stall warning sounded, then stopped. The crew pulled the nose up instead of lowering it.
For the next four minutes, the aircraft climbed, slowed, and fell. The pilots didn’t know they were in a stall until it was too late to recover. To many, this was yet another story of “pilot error.” To those of us who study human factors and performance, it was something far deeper. It was a lesson in what happens when routine breeds complacency, systems erode skill, and normal becomes dangerous.
Lesson 1: Routine Success Breeds Invisible Risk
The Air France crew faced a situation that, on paper, was well within their capability. Frozen pitot tubes weren’t new. Autopilot disconnects weren’t new. But because these events were rare and almost always recovered from easily, the risk felt theoretical. On this occasion, however, the conflicting indications presented to the crew obscured what was actually happening.
The pilots were highly experienced, but their experience had been shaped by thousands of hours of routine operations under automation. Manual flying at high altitude, a delicate and occasionally unforgiving task, was a skill they hadn’t practised in years. In short, they were experts in normal, not in failure.
Similar patterns appear in organisations everywhere. Teams become so accustomed to things “just working” that they stop questioning why. Success can hide fragility. Systems that have always delivered are assumed to continue to do so.
The phrase “If it isn’t broken, don’t fix it” becomes a trap because by the time something does break, it might be too late to recognise it or, even worse, do anything about it.
Business takeaway: Audit success as critically as failure. Every smooth project, every quarter that “just worked,” should invite questions:
What would happen if this system failed?
When was the last time we tested the human response to failure?
Are we maintaining skills, or just relying on systems?
Routine success breeds invisible risk. Challenge “normal” before it challenges you.
Lesson 2: Automation Should Assist, Not Replace
Air France 447’s pilots were victims of what psychologists call automation complacency: the belief that technology is managing everything correctly. When the autopilot disconnected, the crew didn’t immediately grasp what was happening. They believed the auto-throttle would maintain airspeed, even though it had also disengaged.
Automation has made aviation safer, but it’s also created new risks. The more capable the systems become, the less engaged humans tend to be. Over time, skill fades, vigilance dulls, and critical thinking narrows to “monitoring” rather than “managing.” Sound familiar?
In business, dashboards, AI tools, and automated processes promise efficiency. Don't get me wrong, they also deliver it. But they also create distance between people and process. Leaders stop asking how data is produced. Teams assume the system knows best. The first time something goes wrong, nobody knows what to do manually.
Business takeaway: Technology should amplify human capability, not anaesthetise it. Build training and culture that keep people “in the loop,” not just “on the loop.”
Ask regularly:
Do we understand what our systems are doing?
Could we operate manually if we had to?
Are our people problem-solvers or passive monitors?
Automation should support awareness, not replace it.
Lesson 3: Training for the Unthinkable
One of the most haunting aspects of Flight 447 is that the pilots never recognised they were in a stall, despite repeated warnings. Their mental model didn’t fit what they were seeing: the stall warning sounded intermittently because of inconsistent airspeed data, and the situation simply didn’t feel like a stall.
They were well-trained, but only for expected problems. The rare, confusing, or “can’t happen” events weren’t in their playbook.
The same sort of problem occurs in organisations when teams train only for predictable issues: fire drills, incident playbooks, and continuity plans cover what’s been seen before, not what’s emerging. A better test of resilience isn’t how you perform under known pressure; it’s how you think under surprise.
Business takeaway: Don’t just train for procedures, train for uncertainty. Run simulations, red-team exercises, or scenario planning sessions that deliberately create confusion. Force your teams to think, adapt, and communicate under ambiguity. Resilience isn’t built through perfection; it’s built through rehearsal for the imperfect. There might not even be a ‘right’ answer, but delving into these ‘unknown unknowns’ expands our ability to deal with the unforeseen if and when it occurs.
Lesson 4: When Culture Rewards Compliance Over Curiosity
A subtle but powerful factor in the Flight 447 tragedy was culture. The airline’s operating style, like many at the time, emphasised compliance and procedural discipline. The pilots followed checklists perfectly. They weren’t reckless or lazy. They were doing exactly what their training and expectations demanded.
The problem was that discipline had replaced curiosity. They didn’t question the instruments, the automation mode, or even each other. Everyone assumed the data must be right.
In business, this same mindset appears when teams follow processes so rigidly that they stop thinking critically. “We did what the system said.” “That’s what the policy requires.” “It’s not my call.” Process is valuable, especially for novices, but it must never suppress questioning, especially as knowledge and experience grow.
Business takeaway: Build curiosity into compliance. Encourage teams to challenge assumptions, ask “why,” and explore anomalies. The moment curiosity disappears, safety and performance begin to erode.
Lesson 5: Leadership Under Uncertainty
The senior pilot on Flight 447 had stepped away for rest when the crisis began, leaving two less experienced crew members in charge. When he returned, confusion reigned. Communication was fragmented and nobody had the full picture.
This wasn’t poor leadership; it was an example of how authority and understanding can quickly decouple in crises. The person in charge may not have the best situational awareness, while those closest to the problem may feel unable to speak clearly. There is a tension here, but ultimately the leader needs to bring it all back together.
Business takeaway: During high-stress moments, leaders must create clarity, not control. Their job is to integrate fragmented perspectives, not impose decisions from limited understanding. Good crisis leadership doesn’t demand obedience; it demands information flow.
Final Thoughts: The Fragility of “Normal”
Flight 447 wasn’t a freak event; it was the predictable outcome of an environment where technology masked fragility, training reinforced normal, and culture rewarded procedure over curiosity.
The same dynamics live inside every organisation that has grown comfortable with success.
“Normal” feels safe, but it’s often where risk hides best. The question for leaders isn’t “Are we safe?” but “What assumptions are keeping us feeling safe?”
Because when routine turns risky, it’s never sudden. It’s gradual. Invisible. Normal. Until it isn’t.
-----------------------------------------------------------------------------------------------------------------------------------------------------

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.
If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com


