The Last Person to Touch It: Why the Swiss Bar Fire Was a System Failure, Not Just the Owner's Failure

  • Mike Mason
  • Jan 13
  • 5 min read
Image courtesy of @visagrad24 on X, capturing the moment the fire started.

When the recent bar fire in Switzerland killed 40 people, public attention moved quickly to accountability and blame. Within days, one person had been arrested: the bar owner. At first glance, that feels reasonable. The owner operated the venue. Modifications had been made. The fire started there. If someone must be held responsible, surely it’s the person “closest” to the event.


The issue with this logic – the last person to touch it must be at fault – is that it's one of the most reliable ways to stop organisations from learning. Once we zoom out, the story becomes far more complex.


The bar had not been fire-safety inspected for around five years, despite inspections being required. Local inspection teams were reportedly undermanned and overtasked. Previous inspections had identified issues, but follow-up was inconsistent. Ultimately, those present on that fateful night believed it was safe. The system, collectively, had come to believe that risk was under control.


Yet, when the system failed, only one individual was placed in handcuffs. This isn’t about excusing poor decisions. It’s about recognising that serious failures are almost never caused by a single person acting alone. They emerge when multiple safeguards fail or quietly erode over time.


The Comfort of Blame and the Illusion of Closure

Blaming the last person to touch the system provides something powerful: emotional closure. It creates a simple story:

  • Someone failed

  • Justice was done

  • The problem is solved


But it’s a false sense of safety because removing an individual usually does nothing to fix:

  • understaffed inspection teams

  • weak regulatory follow-up

  • ambiguous responsibilities

  • design flaws baked into the venue

  • cultural assumptions that “if it passed last time, it’s fine”


In complex systems, accidents don’t happen because one person wakes up and decides to do something dangerous. They happen because multiple layers of protection are missing, weakened, or assumed to be someone else’s job. The owner may have been the last person to “touch” the system, but they were not the only person shaping it.


Everyone Thought It Was Safe

One of the most uncomfortable truths in tragedies like this is that, at the time, no one thought they were operating unsafely.

  • The owner believed the venue was acceptable.

  • The staff believed they were working in a legitimate business.

  • The regulators believed inspections had been completed in the past.

  • The inspection system believed it was managing risk across the region.

  • The customers believed the environment was safe.


This is how systems fail in the real world: hardly ever through wilful negligence, but through collective confidence built on assumptions. When nothing goes wrong for years, safety stops feeling like something that needs active attention. It becomes background noise. Once that happens, small deviations – missed inspections, temporary workarounds, informal approvals – all start to feel normal.


This is known as normalisation of deviance: the gradual acceptance of risk because adverse outcomes haven’t yet occurred. The danger is that by the time the outcome does occur, the deviation is no longer visible as deviation. It’s “just the way things are done.”


Design and Oversight Matter More Than Individual Intent

With hindsight, it’s easy to identify flaws in the venue:

  • interior materials that burned rapidly

  • escape routes that were insufficient

  • charging practices that created ignition risks

  • reliance on manual processes rather than layered protections


But hindsight also reveals something else: the system was fragile by design. A robust safety system does not rely on:

  • one person remembering to do the right thing

  • one organisation noticing a problem

  • one inspection catching everything

  • one layer of defence holding indefinitely


Robust systems assume humans are fallible and compensate accordingly. In aviation, this principle is fundamental. Inspections overlap. Automation monitors continuously. Independent oversight challenges assumptions. Near misses are reported and shared.

In this case, inspection regimes were infrequent, under-resourced, and reactive. The owner became, in effect, the final barrier – a role no individual should ever carry alone. And yet, when that barrier failed, the legal system treated it as if it had always been the only one (so far, at least).


The Missed Opportunity: Inquisitive Cultures and Pre-Mortems

One of the most striking absences in stories like this is structured curiosity.

What if, years earlier, someone had asked:

  • If a fire starts here, how does it unfold?

  • Where do people go?

  • What assumptions are we making about detection and escape?

  • What would worry us most if we were sleeping below deck or below ground?


These are pre-mortem questions – deliberate attempts to imagine failure before it happens.

High-reliability organisations use them routinely. They invite red teams, independent reviewers, and external challenge not to assign blame, but to surface uncomfortable truths while there is still time to act.


In this case, there was no structured mechanism forcing those questions to be asked. Curiosity was optional. And optional curiosity rarely survives workload, time pressure, or commercial incentives.


Business Parallel: Who Gets Blamed When Your System Fails?

This story isn’t unique to hospitality, fire safety, or regulation. The pattern appears everywhere:

  • A project fails → blame the project manager

  • A safety incident occurs → blame the frontline worker

  • A compliance breach happens → blame the person who signed last

  • A customer data breach occurs → blame the IT lead


In each case, the “last person to touch it” becomes the lightning rod. If your system requires flawless behaviour from individuals to remain safe, it is already unsafe.


The organisations that genuinely improve after failure are the ones that ask:

  • What allowed this to make sense at the time?

  • Where were our safeguards thin or absent?

  • What pressures shaped the decisions people made?

  • Which assumptions went unchallenged for too long?


That’s where learning lives.


Final Thought: Systems Fail Long Before People Do

The Swiss bar fire did not begin the night flames appeared. It began years earlier with missed inspections, fading curiosity, normalised shortcuts, and an under-resourced oversight system.


Arresting the owner may satisfy a desire for accountability. But it does little to prevent the next tragedy if the system remains unchanged.


If we truly want safer outcomes in hospitality, transport, healthcare, or business, we must stop asking “Who was last to touch it?” and start asking “Where did the system quietly give up?”


The most dangerous failures are the ones everyone thought were impossible. Right up until they weren’t.

-----------------------------------------------------------------------------------------------------------------------------------------------------


On Target Co-Founders: Mike Mason and Sam Gladman

Mike Mason and Sam Gladman are the co-founders of On Target, a leadership and team development company that brings elite fighter pilot expertise into the corporate world. With decades of combined experience in high-performance aviation, they specialise in translating critical skills such as communication, decision-making, and teamwork into practical tools for business. Through immersive training and cutting-edge simulation, Mike and Sam help teams build trust, improve performance, and thrive under pressure—just like the best flight crews in the world.


If you'd like to learn more about how On Target can help your team, contact Mike and Sam at info@ontargetteaming.com
