SHIP OF FAIL: How do we right capsized institutions we thought would NEVER go under?

We write a proper bug report, for starters

Crawling from the Wreckage: John Watkinson writes the first in a series of essays for El Reg in which he examines failures in society, from banking and education to transport and IT. But why whine about this stuff so much, he ponders, when we can simply get on with the business of problem solving?

The last seven or so years since the economic crash have not been much fun for a lot of people, but they have been highly educational. In considering what that education has brought to light, I should make it clear that neither this piece, nor those that may follow, is a cheap catalogue of disasters and contempt for the perpetrators. I suspect any contempt I can add is a drop in the ocean, but, more importantly, complaining for its own sake serves no purpose.

If we can identify areas that are ripe for change, and suggest tangible courses of action the individual can take, then we might make some progress – but only if there is motivation to do so.

Real solutions to problems are rare, because problem solving is an art form. Problems themselves are seldom visible. The visible aspects are symptoms. Making the symptoms go away does not solve problems; in fact it diverts energy and resources from the identification of the problem.

In the problem solving arena, my hat is openly taken off to air accident investigators, who sift through the charred flesh and twisted metal trying to find what went wrong with the simple and laudable goal of preventing a recurrence.

But they are in a minority and most catastrophes are not open to such analysis. Even though it may be in the interests of society, it is not in the interests of those whose actions contributed.

Who, us, NASA, downplay risk?

A perfect example of that was the inquiry into the Challenger Space Shuttle catastrophe on 28 January 1986, in which the only actual technical investigation that dug down into the causes was carried out personally by Richard Feynman.

And the investigating body itself tried to suppress Feynman's findings; it was only his high ethical standards and high public profile that led to the facts emerging. Basically, what he found was a gap of several orders of magnitude between the safety assessment of NASA management and the safety assessment of their engineers.

There was not just a failure to assess risk, but known risks were downplayed or ignored.

MS Herald of Free Enterprise being towed into the harbour of Vlissingen, Netherlands, after salvage, May 1987.
Photo by Archief Ranter, licensed under CC 1.0

The broken institution age

Closer to home we saw just that with the loss of the Herald of Free Enterprise, which set sail on 6 March 1987 with the bow doors open because the doors could not be seen from the bridge and there wasn’t an indicator. I had travelled on that vessel several times before it was lost. There but for fortune ...

The downplaying and ignoring of risks, leading to catastrophe, has become a defining feature of the times we live in, along with the hampering of investigations and the implementation of cover-ups. The end result is a deserved loss of confidence in the institutions concerned. The problem is that so many institutions have been identified as suffering from this syndrome that one could be forgiven for assuming it is universal, in the hope of being pleasantly surprised by an exception.

The carpet we sweep it all under is full. We seem unable to learn from deeply negative events and change our ways, and so we are doomed to repeat them. We have been through the stone age, the bronze age, the iron age and the industrial revolution, and now our broken society has entered the broken institution age.

We have built these systems that don't work, and the fact that they don't work is staring us in the face. But when we built them we were so proud of them that we made them rigid and incapable of change, so that those administering them didn't need to make any value judgements. Indeed, our pride extended to considering these systems infallible, and therefore incapable of error or of needing improvement.

But the rigidity that would make them last forever became the inflexibility that would prevent them adapting to change. The only thing we can be certain of is change. Nothing is forever, yet our institutions are structured in such a way that those brave souls who identify problems and suggest change not only cannot be right, but are seen as disloyal and shown the door. They are the lucky ones, because they are not on board when the ship of complacency is deftly steered onto the rocks of change whilst the PR department tells us how safe it is.

A good way to determine the stability of a boat is to rock it. How many times have we been told not to rock the boat? Doesn't that tell us that the stability of the boat is an illusion? We also have euphemisms that help to keep things the way they are. We have taboo subjects that can't be discussed. We have ineffability and, as Douglas Adams pointed out, it's about time we effed it.


