The Operationally Excellent Sinking Of The Titanic
Kobus Cilliers | On 01, Mar 2020
One of the most enduring workshop features in the TRIZ community is the 'save the Titanic' exercise first conceived by Ellen Domb. Nominally it is about resources, and as such often gets used alongside the 9-Windows tool. In many ways, it is the gift that keeps on giving. Sometimes I'll find myself in a room with self-professed Titanic experts. Sometimes – probably one workshop in five – I'll still hear great solution ideas I've never heard before. Only very rarely do groups not find convincing ways to save all of the 2,224 (officially) people on board the ship (Minnesota teachers, Saudi-Arabian engineers and Generation-Z apprentices being the only statistically significant groups I've been able to observe not solving the problem). As such, it offers up a wonderful example of the power of psychological inertia. Not just from the 1,500 souls that lost their lives on the night the ship hit the iceberg, but from the groups that participate in the exercise, many of whom, I notice, look at each other panic-stricken as I describe what I want them to achieve in the five minutes I allow them to look for resources and think about how they are going to deploy them.
In the last couple of months, I've been experimenting with reframing the exercise as an (excuse the pun) ice-breaker, positioned as close to the beginning of a workshop as I can manage, and very definitely at a point before I've introduced any TRIZ/SI concepts, or, indeed, any thinking tools at all. The new framing is set around a theme that increasingly sets the context for many of the innovation-related workshops we conduct. And that is the fundamental difference between 'Innovation World' and the everyday 'Operational Excellence World' of business as usual that exists in almost every kind of enterprise on the planet.
My provocation to groups here is that the primary reason for the death of 1,500 people was not so much psychological inertia in general, but psychological inertia specifically emerging from the Operational Excellence World thinking that increasingly dominates how everyone above the age of eight sees the world, and learns to be successful in the world. If I'm feeling particularly brave or provocative, my message is that Operational Excellence thinking is a terrifically efficient way of operating in the world, until the moment it kills you.
Operational Excellence kills. Operational Excellence in complex and chaotic situations kills absolutely.
So, how can this assertion be justified? Especially since there was no such thing as 'operational excellence' in 1912. There wasn't, but the underpinning 'DNA' of how humans think definitely was there:
- If the ship is 'unsinkable', then training for sinking-like emergencies equals 'waste', and therefore wasn't done. When it became apparent that the ship was sinking, no-one really had a detailed idea of what needed to be done.
- When the problem becomes visible to people, they do what 'natural' human instincts tell them to do and jump straight into 'action'. Apart from the band continuing to play, everyone else quickly devolved to running around like headless chickens, instead of gathering together for five minutes and thinking about resources and a plan of action… which is kind of what the workshop exercise is all about. In Operational Excellence World, 'action' beats 'planning' or 'thinking' every day of the week.
- When people realise there is a problem, 'instincts' kick in. Which, in the early morning of 15 April, meant people being told to put on life-vests. Which in turn meant a fair number would freeze to death once they found themselves in the water. They were floating, though, so I suppose it was helpful from the perspective of retrieving the bodies. Operational Excellence is all about training people's instincts to follow the beautifully optimized rules. However, following rules that are no longer applicable, as the Titanic showed, can easily become a very dangerous thing. Someone needs to establish when 'the rules' do and do not apply. When a controlled system devolves into chaos, it is highly likely that the 'normal' rules no longer apply.
- Operational excellence happens when people follow 'the rules'. 'The rules' have been developed in order to ensure the maximum level of organizational efficiency. 'The rules' determine what 'common sense' means. Anything that doesn't fit the rules therefore tends to be rejected. Thus precluding any ideas that are in any way counter-intuitive. A really good strategy on the night of the Titanic sinking would have been to open up holes (e.g. at the bilge pumps) in order to allow water into the rear half of the ship. This would have allowed the ship to sink horizontally instead of the front-first dive that actually took place, the stresses from which eventually caused the back of the ship to break. Another counter-intuitive idea that would definitely have helped would have been to collect all of the life-vests and make sure that no-one put them on. Better still, collect all the life-vests, connect them together and make them into something that floated. Or how about emptying all of the food and potable water stored in the lifeboats? (The lifeboats were designed for people to be in them for several days. In our situation – where the Carpathia is known to be four hours away – we know that people will only be in the boats for a few hours.)
- Operational excellence also comes attached to command-and-control hierarchies, in which the person at the top of the hierarchy instructs the next level down, that level then instructs the next, and so on down to the bottom of the ladder. Then, when everyone knows what is expected of them, their job is to get on and do it. Nothing more, nothing less. When everything is working well, this makes for a sensible way to operate. When the system is not working well, however, command-and-control is almost never an appropriate way to operate. This is particularly the case when there are clear time pressures. The Titanic took two hours to sink from the time of hitting the iceberg, which is very little time for any kind of instruction to feed down from the Captain to the bottom of the crew hierarchy, and very definitely not enough time for any kind of meaningful information from the crew to feed back up to the top of the hierarchy. Far more effective in time-critical, chaotic problem situations is the idea of 'commander's intent'. This is a way of operating adopted by an increasing number of armed forces – combat being, in many ways, the archetypal definition of a chaotic system. The idea of 'commander's intent' is that the senior leadership defines the outcomes they are looking to achieve ('make sure everyone stays alive') and then empowers smaller groups of soldiers to do whatever they see as necessary, within the context of their own unique environment, to achieve the stated goal.
- A subset of the hierarchical command-and-control nature of Operational Excellence World says to people, 'if you've not been told to do anything, don't do anything'. There is no such thing as 'initiative' in Operational Excellence World. When crew members didn't receive instructions on the night of the incident, they therefore tended to do nothing.
- Operational Excellence World nominally demands that people lower down the hierarchy provide feedback to those higher up, but because Operational Excellence works on the principle of 'continuous improvement', that demand often gets interpreted as 'unless you're bringing me good news, don't bring me any news at all'. Or versions thereof. 'Don't bring me problems, bring me solutions' is a frequently heard management statement. If there's bad news, people at the bottom learn their best strategy is to stay quiet and feign ignorance. A good Innovation World manager, conversely, might be heard to say, 'don't bring me solutions, bring me problems.' How often have you heard that expression?
- When a problem occurs within an operationally excellent system, people are expected to look for solutions within the system. People are, in other words, constrained to operate 'within the box'. This despite the fact that much better solutions might exist 'outside the box'. The iceberg, for example, was certainly responsible for causing the Titanic's problem, but, with a tenth of it above sea-level, it also offers the potential to be a really useful means of keeping people out of the water until rescue arrives.