Was This Always Going to Happen? Lessons from the loss of HMNZS Manawanui

  • Mike Mason
  • Mar 31
  • 7 min read
HMNZS Manawanui sinking. Image courtesy of Tom Udall

The story

On the evening of 5 October 2024, HMNZS Manawanui was conducting survey operations off the southern coast of Upolu, Samoa. The conditions were far from benign. Winds were sitting at around 20–25 knots, the sea state was elevated, and the ship was operating in close proximity to a reef.


This was routine work for the ship. The task was to collect hydrographic survey data in support of an upcoming international event, which meant operating in a defined area while maintaining a precise survey pattern.


At around 17:46, the ship was under autopilot, making slow progress at approximately 6 knots. The Officer of the Watch confirmed they had control. A discussion followed on the bridge about how to adjust the survey pattern to fill gaps in the data. It was a normal conversation, the kind that happens frequently during this type of operation.


Shortly afterwards, the ship began a turn and something subtly changed. Control inputs did not produce the expected response and, while attempts were made to alter course and manage the ship’s movement, the vessel continued to drift towards the reef. Commands that were expected to slow or stop the ship did not have the intended effect.


There was no single dramatic failure. No clear moment where everything suddenly broke down. The situation evolved. Attention among the crew was divided and their understanding of the ship’s behaviour was incomplete. The margin for recovery quickly narrowed. And then the ship grounded.


What followed was a series of impacts as the vessel struck the reef multiple times before becoming stranded. Wave action drove the ship against the reef, causing progressive structural damage. Water entered the vessel, stability degraded, and the crew were ultimately forced to abandon ship.


The ship was lost. Everyone survived.


What the report says was the direct cause

The report identifies a set of direct causes, focusing on the immediate conditions that led to the grounding. These include: the ship leaving the intended survey area; ineffective control of the vessel; and a loss of positional awareness relative to the reef.


These statements are accurate. They are also incomplete.


They describe what happened in the final moments before the grounding, yet they do not explain why those actions made sense to the people involved at the time. They present the outcome as a deviation from what should have happened, rather than as the natural result of how the system was functioning.


If we stop at this level, the learning becomes limited. The story becomes one of error and correction, rather than one of understanding.


Contributing and aggravating factors

The report goes on to outline an extensive range of contributing and aggravating factors, including weaknesses in risk management, confusion in policy and procedures, training system inadequacies, and issues with documentation and assurance.


Taken together, these factors point towards something much bigger than the actions on the bridge that evening. They describe an organisation under strain.

There are references to unclear or overly complex operational instructions, gaps in training and experience, and inconsistencies in how risk was understood and applied. There are also indications that systems designed to provide oversight and assurance were not functioning as effectively as intended. These are systemic issues.
And yet, even here, the report tends to describe what was missing rather than exploring why those gaps existed or how they shaped behaviour in real time. The result is a list of deficiencies without a clear narrative of how those deficiencies combined to produce the outcome.


The problem with counterfactuals

Throughout the report, there are numerous counterfactual statements describing what individuals should or could have done differently.

One example stands out:

“The Court considers that this should not have distracted [the individual]…”

This type of statement is common in investigations. It is also fundamentally unhelpful.

People get distracted. That is not a failure to be corrected through instruction; it is a reality to be managed through system design.


Saying someone should not have been distracted does not explain why the distraction occurred, why it was not recognised as significant at the time, or what in the environment made it possible for that distraction to influence the outcome.


Counterfactuals create a clean version of events that did not happen. They assume clarity, hindsight, and perfect awareness. The people involved did not have those advantages. If we want to learn, we need to understand the world as it was experienced at the time, not as it appears after the fact.


Hollowness: the system beneath the surface

A recurring theme in the report is the concept of hollowness. This describes an organisation that is stretched, operating with reduced capacity while still being expected to deliver the same level of performance. It is a powerful and often underappreciated factor.


Hollowness shows up in many ways: reduced experience levels, thinner supervision, greater reliance on individuals to fill gaps, and increased complexity in managing routine tasks.

Under these conditions, people adapt. They make trade-offs. They prioritise what seems most important in the moment.


Most of the time, those adaptations work. In fact, this is one of the reasons that humans are awesome. Occasionally, however, the adaptations don't work.


The point to consider is that hollowness does not cause failure directly. It helps to create the conditions in which failure becomes more likely.


Risk management in theory and practice

The report highlights deficiencies in risk management and recommends reviewing policies, procedures, education, and training. At face value, this sounds reasonable.


In practice, it raises some difficult questions:


What does effective risk management look like in a complex, time-pressured environment? How do we define what “good” looks like before the event?

What training would genuinely change how decisions are made on the bridge for unexpected situations?


At the time, it is highly likely that those involved believed they were managing risk appropriately. Their actions were shaped by their understanding of the situation, the tools available to them, and the organisational context in which they were operating.


Adding more policy or training may create a sense of action and closure. It does not necessarily change how risk is perceived and managed in reality.


Procedures, assurance, and the illusion of control

The report identifies significant deficiencies in operational instructions and notes that routine assurance activities should have identified these issues (another counterfactual).

This is a critical point. If the assurance process did not detect these problems, then the question is not simply why the crew used unsuitable procedures. The question is:


Why didn’t the system designed to detect those issues work as intended?


There is a tendency in organisations to assume that assurance provides an accurate picture of reality. In practice, assurance often reflects how work is documented rather than how it is performed.


If the gaps were not visible to those responsible for identifying them, it is unrealistic to expect those operating within the system to have a clearer view.


Training, competence, and the risk of box-ticking

The report also highlights weaknesses in training, posting, and record-keeping systems, followed by detailed recommendations about qualifications, experience, and documentation.

There is a real risk that this leads to increased complexity without improved understanding.


As more requirements are added, the focus can shift towards demonstrating compliance rather than developing capability. The more boxes that need to be ticked, the easier it becomes to lose sight of what actually matters. Competence is not simply a function of qualifications and records. It is shaped by experience, context, and the environment in which people operate.


Hollowness, revisited

The report returns to hollowness as a key organisational issue. For me, this is where everything begins to connect. The deficiencies in training, the complexity of procedures, the weaknesses in assurance, and the challenges in risk management can all be linked back to a system that is stretched.


When organisations operate in this state, they rely on people to bridge gaps that should not exist. They normalise conditions that would otherwise be seen as unacceptable.


Over time, this becomes the new normal. Until something breaks.


The problem with blame

Despite the range of organisational issues identified, several senior members of the crew are facing court martial and potential prison sentences. This raises a fundamental concern.


If individuals who are known to be operating within a stretched system, under time pressure, with unclear procedures and limited support, are held personally accountable to this extent, what message does that send to the rest of the organisation?


It suggests that, regardless of systemic conditions, responsibility ultimately sits with those at the sharp end. This has predictable consequences.


People become less willing to report issues. They become less open about mistakes. They become more focused on protecting themselves than improving the system. A culture of learning cannot grow in an environment where the ultimate consequences of failure are primarily punitive.


Final thought

This was not an accident caused by a single mistake. It was the result of a system that had become stretched, complex, and difficult to navigate. The people involved were doing their best with the conditions they were given.


Those conditions shaped their decisions, their actions, and ultimately the outcome. If we want to prevent similar events, the system's focus cannot be on punishing good people or trying to find better ones. It has to be on building better systems.


When people are consistently asked to perform in conditions that are less than ideal, the outcome is not a matter of chance.


It is a matter of time.

-----------------------------------------------------------------------------------------------------------------------------------------------------

On Target Co-Founders. Mike Mason and Sam Gladman

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.


If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com.
