
When Automation Fails: Business Leadership Communication Lessons from Asiana Flight 214

  • Writer: mikemason100
  • Oct 28
  • 5 min read
The wreckage of Asiana Flight 214. Image courtesy of the NTSB

On July 6, 2013, Asiana Airlines Flight 214, a Boeing 777 from Seoul, approached San Francisco International Airport under clear skies. As the aircraft neared the runway, it was far too low and too slow. The tail struck the seawall, the fuselage broke apart, and three passengers died. Miraculously, 304 others survived.


At first glance, it looked like a classic case of “pilot error”: a failure to monitor airspeed on approach. But the NTSB report revealed a far more complex picture involving automation dependency, miscommunication, and cultural barriers that shaped how the crew worked together.


And as is so often the case (as you've probably realised if you've read some of our other blogs), the lessons reach well beyond aviation, into leadership, team dynamics, and organisational resilience.


Lesson 1: Automation Is a Tool, Not a Crutch

Asiana 214’s pilots were relying heavily on the Boeing 777’s automated systems, in particular the auto-throttle, which maintains speed by managing engine power. They believed it was holding their approach speed. It wasn’t. In the mode the aircraft was in, the auto-throttle was not controlling speed, a nuance the crew did not fully understand.

The crew’s trust in automation overrode their situational awareness. They assumed the system was doing what they wanted, rather than verifying it.


In business, automation complacency is rarely deadly, but it can still have serious consequences. We increasingly rely on technology such as dashboards, analytics, AI, and automated workflows to make complex decisions for us. But when we stop verifying, cross-checking, or questioning what the system tells us, we give up control. Automation should enhance human performance, not replace it. When teams treat tools as infallible, they lose the curiosity that keeps systems safe and effective.


The takeaway: Build a culture or system of verification, not blind trust. Ask “Is this really doing what we think it is?” instead of assuming technology has it handled.


Lesson 2: Communication Must Bridge, Not Bow to, Hierarchy

Inside the cockpit, the pilot flying was an experienced captain undergoing training on the 777, supervised by an instructor pilot, with a relief first officer observing from the jump seat. Crew members who noticed the aircraft was below the glidepath hesitated to speak up assertively. A steep authority gradient, reinforced by cultural norms of deference and politeness, discouraged direct challenge. By the time the crew recognised the situation and called for a go-around, it was too late; the decision came just seconds before impact.

This wasn’t a lack of skill. It was a lack of communication, which led to a loss of situational awareness and a decision made too late.


In business, the same thing happens daily. Team members spot issues but don’t raise them because they don’t want to “overstep.” Junior staff defer to senior leaders, assuming “they must know best.” Projects drift toward failure while everyone stays politely silent. The most effective teams actively flatten hierarchy. They train members to challenge decisions respectfully and expect feedback from all levels.


Leaders must make speaking up safe, and silence unsafe. In aviation, this philosophy is a big part of what is now called Crew Resource Management. In business, it’s all part of psychological safety, the foundation of high-performing teams.


Lesson 3: Shared Awareness Is a Skill, Not a Given

As the Boeing descended through 500 feet, the three pilots each believed someone else was flying, monitoring, or managing power. In reality, no one had the full picture. Each operated with partial understanding, assuming the others were in control. This is the essence of lost situational awareness, and it’s rarely caused by one person. More often, it’s a breakdown in how information is shared and confirmed within a team.


In complex business environments, the same thing happens. Sales thinks operations has checked the numbers. Operations assumes finance signed it off. Everyone’s “sort of aware,” but no one owns the full picture until something goes wrong. Shared awareness must be actively maintained through structured communication: check-ins, summaries, and explicit role clarification.


The takeaway: Never assume shared understanding. Build it and confirm it deliberately. Whether you’re landing a plane or launching a product, clarity about “who’s doing what” might just make the difference.


Lesson 4: Blame Fixes Nothing, Learning Fixes Everything

The NTSB report was remarkably balanced. While identifying pilot error, it also highlighted systemic contributors: unclear automation logic, training gaps, and cultural barriers.

Contrast that with how most businesses respond after failure. Reports that read, “The operator failed to…” or “The manager neglected to…” might sound decisive and make the report writer feel better, but they kill learning. Aviation’s evolution after Asiana 214 wasn’t about punishing individuals but about changing systems: clearer automation design, more realistic training, and improved CRM.


In business, accountability should drive learning, not punishment. When something goes wrong, ask:

  • What conditions made this mistake possible?

  • How can we make the right action easier next time?

Blame feels satisfying. Learning is harder, but infinitely more valuable.


From Cockpit to Conference Room: The Business Parallels

The story of Asiana 214 is a case study in what happens when smart people operate in complex systems under pressure:

  • They trust technology too much.

  • They hesitate to speak up.

  • They assume understanding is shared.

Sound familiar?


In business, success often depends not on intelligence or skill, but on how teams manage communication, challenge, and clarity under pressure. The same human factors that cause air crashes can cause project failures, financial losses, and cultural breakdowns. The best organisations learn from aviation’s transformation:

  • Build verification into systems. Don’t trust, cross-check.

  • Encourage challenge. Flatten authority when it matters most.

  • Cultivate awareness. Make the invisible visible through open communication.

Because in the end, whether you’re flying an airliner or running a business, safety and success rely on the same thing: humans working together effectively to achieve their goals, especially when under pressure.


Final Thought

The crash of Asiana Flight 214 wasn’t about incompetence; it was about the silent traps of modern systems: trust without verification, politeness without clarity, and structure without flexibility. Aviation learned the hard way that systems must be built around how humans actually behave, not how we wish they would.


Business leaders would do well to do the same.

-----------------------------------------------------------------------------------------------------------------------------------------------------

On Target co-founders Mike Mason and Sam Gladman

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.


If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com