When Everyone Thought It Was Safe: What the MV Conception Fire Really Teaches Us About System Failure
- mikemason100
- Dec 9, 2025
- 4 min read

In September 2019, the dive vessel MV Conception caught fire off the coast of California, killing 34 people as they slept below deck. It remains one of the deadliest maritime disasters in modern US history. In the aftermath, public attention quickly narrowed. One individual, the captain, was prosecuted. The narrative became familiar: lack of crew training, dereliction of duty, and failure to enforce standards. And yet, when we step back, a far more uncomfortable truth emerges, one nobody wants to talk about:
Everyone thought things were safe.
The operator believed the vessel was fit for purpose. The crew believed they were operating normally. The Coast Guard believed the boat met regulatory standards. And the passengers, tragically, believed they were protected. This wasn’t a system that knew it was unsafe. It was a system that had slowly convinced itself that everything was fine. That distinction matters. Because if we treat disasters like this as the result of “bad people doing bad things,” we miss the opportunity to fix the systems that quietly allow risk to grow unchecked. And there are a lot of them out there...
Lesson 1: An Inquisitive Culture Exposes Risk Before It Becomes Normal
The NTSB report makes it clear that multiple hazards existed long before the fire:
A single escape route from the bunkroom
No roving night watch
Inadequate fire detection and suppression
Poor battery charging practices
Ambiguous safety responsibilities
None of these were secrets. The problem wasn’t ignorance; it was a lack of curiosity.
An inquisitive organisation asks uncomfortable questions before something goes wrong:
What assumptions are we making about safety here?
Where would this system fail if conditions changed slightly?
What keeps us awake at night — and why haven’t we acted on it yet?
Tools like pre-mortems, red teaming (read more about these here), and structured challenge sessions exist precisely for this reason. They allow teams to imagine failure in advance, not as an exercise in pessimism, but as an investment in resilience. In business, we rarely lack data. What we often lack is permission to question long-standing practices that “have always worked.”
If no one is paid, promoted, or protected for asking difficult questions, risk doesn’t disappear — it just goes quiet.
Lesson 2: Normalisation of Deviance Is How Unsafe Becomes “Just the Way We Do Things”
One of the most insidious contributors to the Conception tragedy was normalisation of deviance: the gradual drift away from rules when nothing bad seems to happen.
The absence of a night watch was not a one-off lapse. It had become routine. Battery charging practices were not new. They were familiar. Emergency egress limitations were known, but accepted. Each deviation felt small. Each was survivable, until the night it wasn’t.
This is how risk accumulates in every industry. When shortcuts don’t immediately lead to consequences, they start to feel justified. Over time, yesterday’s exception becomes today’s standard operating procedure.
In organisations, this often sounds like:
“We’ve always done it this way.”
“That rule isn’t really practical.”
“It’s never been a problem before.”
The danger isn’t rule-breaking. The danger is rule erosion without reflection.
High-reliability organisations don’t just enforce standards; they regularly revisit them, asking:
Which rules are being bent?
Why do people feel the need to bend them?
What does that tell us about the system we’ve designed?
When rules are routinely bypassed, it’s rarely because people are reckless. It’s usually because the system makes compliance difficult, inefficient, or unrealistic.
Lesson 3: Oversight Isn’t the Enemy. It’s a Safety Net
Another uncomfortable takeaway from the Conception fire is the role of external oversight.
The vessel had passed inspections. Regulatory bodies did not identify or enforce changes to known risks. In hindsight, this feels like a failure. But oversight bodies don’t operate in hindsight. They operate within the same constraints, assumptions, and norms as everyone else. Healthy oversight isn’t about punishment or box-ticking. It’s about providing an external perspective that challenges internal blind spots.
In aviation, shipping, healthcare, and business, strong systems benefit from:
Independent audits that look beyond compliance
Regulators who ask “what worries you?” rather than “are you compliant?”
Oversight that evolves as operations evolve
The absence of robust challenge allows unsafe designs and practices to persist, not because people don’t care, but because no one is structurally tasked with asking the next-level question.
In business, this might come from:
Non-executive directors
External safety or risk advisors
Peer reviews across departments
Customers or frontline staff empowered to raise concerns
Oversight should not be feared. It should be welcomed as a learning mechanism.
Why Prosecuting the Captain Isn’t the Same as Making the System Safer
Holding individuals accountable feels satisfying. It offers closure. It signals action. We like these feelings. But accountability without learning is theatre. The captain of Conception may well have made mistakes. But focusing solely on individual failure obscures the deeper reality: the system allowed those mistakes to be survivable right up until the night they weren’t.
If we truly want to prevent similar tragedies, whether in maritime operations, aviation, healthcare, or business, we must stop treating disasters as moral failures and start treating them as design failures. Systems that rely on perfection from humans are not safe systems.
The Business Parallel: You’re Closer to “Conception” Than You Think
Most organisations involved in serious incidents didn’t ignore safety. They believed they were safe, right up to the point they found out they weren’t.
That’s what makes their stories uncomfortable, and valuable.
Ask yourself:
Where have we accepted risk because “nothing’s happened yet”?
Which rules do people quietly work around?
Who is encouraged to challenge uncomfortable truths, and who isn’t?
When was the last time we tried to break our own system on purpose?
Learning shouldn’t start after disaster. It should start with curiosity, humility, and the willingness to look at our systems before they fail us. Because by the time blame feels obvious, learning is already too late.
-----------------------------------------------------------------------------------------------------------------------------------------------------

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.
If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com


