I was working a late-night shift as an Emergency Room physician in February 2003 when I started thinking about the recent Space Shuttle Columbia disaster that resulted in the deaths of seven astronauts. It was upsetting on multiple levels, but my one persisting thought was: "If the best and brightest of NASA management could be responsible for such a disastrous decision - what hope was there for me and my decision skills in the ER?" Since no profession is exempt from serious decision errors, is there a simple way to improve our critical decision making process? What can we learn from this disaster?
"Shuttle Thinking" resulted from those rare, quiet ER moments when I would put my feet up on my desk and try to analyze my own decision making process, searching for ways to improve it. I studied the Colombia disaster and compared it to my own style of making decisions. If the Columbia had been a patient, what would I have done differently? How could I improve my own decision process and then share it with others? "Shuttle Thinking" is what I now collectively call a set of five common pitfalls that I believe undermine our critical decision making process.
Shuttle Thinking -- 5 Pitfalls in Critical Decision Making:
Unique situations
Data deficit
Emotional denial
Probability
Positive reinforcement
To improve my decision making process, I now consciously examine the impact of "Shuttle Thinking" on every high-level decision I make.
Now, I want to walk you down this "Shuttle Thinking" path, using the Columbia disaster as an example. As you explore this path, feel free to insert other examples that come to mind. Recently, the decisions of large financial institutions and government agencies have provided a horn of plenty! Hurricane Katrina or the sinking of the Titanic will also work, since they all follow the same path. Let's begin.
-------------------------------
As you recall, shortly after Columbia's launch, a piece of insulating foam about the size of a phone book apparently broke off from the external fuel tank, striking the left wing. The extent of the damage to the left wing was not known. NASA managers felt that no action was needed, and the Columbia was allowed to return to Earth - a normal, uncomplicated re-entry was expected. However, after the loss of the Space Shuttle Columbia and her crew, the subsequent findings of the Columbia Accident Investigation Board (CAIB) criticized the decisions of NASA management.
1--Unique Situations
Unique situations, by definition, have no learning curve. NASA management had no training manual solution for the Space Shuttle Columbia incident. Scenario planning for such an event had never been proposed. NASA management evaluated the situation as it unfolded; therefore, they became the learning curve. As is often the case with our bad decisions in unique situations, the eventual horrific outcome was never even an initial consideration.
KEY: Unique situations must be approached cautiously, treated as inherently risky and dangerous, and recognized as initiators of poor decision making.
2--Data Deficit
Sometimes, not enough data exists to help you make a wise decision; important decisions are sometimes made on little or no information. In the case of Columbia, there was no available information to determine whether the left wing of the craft had been damaged. There were limited structural sensors in the wing, and there was no robotic video camera to visualize it. The area in question was not even visible from Columbia's windows. So, not only was minimal data available, but there were few options for obtaining any additional data. Launching another Shuttle to "fly by" the Columbia, take a "visual," and check for damage was not a simple option. Extravehicular activity (a space walk) was also a difficult option. However, there was an option to use Defense Department technologies (possibly satellite or telescopic imaging) to look at the wing, but NASA management did not exercise this option.
KEY: A data deficit, where the available information is inadequate to properly inform a critical decision, makes it mandatory to obtain additional data.
3--Emotional Denial
As humans, given the variable outcomes of the choices in our daily lives, we naturally tend to gravitate towards the positive and to ignore, or even deny, the fact that painful, negative outcomes are possible. If we did not have this propensity towards optimism, we might become paralyzed in our daily activities - we might even avoid taking that "risky" commute to work. Some outcomes, such as the Columbia breaking up on re-entry, are so uncomfortable that we often choose not to give them the full consideration they deserve. Many times, denial of the difficult or threatening components of our decisions allows us to choose the easier, more comfortable options. In the case of the Columbia, the easiest decision was to deny that there was a serious problem and to do nothing.
KEY: Emotional denial frequently shifts our decision making towards the easier, more comfortable solutions.
4--Probability
This is the pivotal point in our decision making process. I certainly was not in the NASA conference room during their risk assessment of the Columbia. However, I can imagine that NASA management struggled with their unique situation, used the limited available data, and finally opted not to obtain any additional data. This initial problem analysis, coupled with a degree of denial of the severity of the situation, likely allowed them to conclude:
"There is probably no damage caused by the foam piece and
nothing further needs to be done."
Would the outcome have been different if the NASA team had restructured their conclusion by replacing the word "probably" with the word "gamble"?
"NASA management has decided to gamble the lives of the 7 crew
members by assuming that the foam piece caused no damage."
KEY: Probability, whether consciously or unconsciously included in your final decision, should be considered synonymous with gambling. The full extent of the gamble then needs to be weighed. Probability = Gambling
5--Positive Reinforcement
Long before Columbia, NASA management had noted smaller pieces of foam breaking off during multiple previous Shuttle missions. Because no problems had resulted from these foam events in the past, they knowingly accepted the fact that small pieces of foam break off, and these foam events came to be considered a normal mission variant. In other words, NASA management had gambled in the past - and won. This winning mindset unfortunately minimized their perceived risks and reinforced their willingness to continue to gamble.
KEY: Gambling and winning tend to reinforce the option of taking additional risks.
ER physicians are faced with recurring probability dilemmas every shift. Early in my career I learned that once you have made the diagnosis of "probable heartburn or indigestion" in 100 patients, you expose yourself to the risk of 1 of those 100 patients returning to the ER with an actual heart attack instead of heartburn. Ninety-nine patients were correctly diagnosed with indigestion, but one patient returns with a true myocardial infarction (heart attack). Do you continue to gamble with your "probable" diagnosis style, knowing that the 1 case in 100 will eventually return? The practice of medicine is replete with similar examples, and physicians eventually learn, by trial and error, that unless you completely verify the diagnosis by searching out additional data such as an electrocardiogram (EKG) and cardiac blood tests, the laws of probability will eventually catch up with you. Because physicians are faced with these recurring decision scenarios in statistically large enough numbers, they rapidly learn the consequences of making a diagnosis based on probability. (By the way, patients often misinterpret this need for additional testing [data] as the physician practicing "defensive" medicine; when, in reality, the physician is trying to protect the patient from the rare event and the laws of probability.)
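To see why the laws of probability eventually catch up with you, here is a minimal sketch of the arithmetic in Python (assuming, purely for illustration, an independent 1% chance that any given "probable indigestion" diagnosis is actually a missed heart attack; the rate and the caseload sizes are hypothetical, not clinical statistics):

# Illustrative sketch: how a small per-case gamble compounds over many cases.
# Assumes each diagnosis is independent and carries the same hypothetical miss
# probability; the 1% rate and the caseloads below are for illustration only.

def chance_of_at_least_one_miss(per_case_miss_rate: float, num_cases: int) -> float:
    """Probability that at least one of num_cases diagnoses is a miss."""
    return 1.0 - (1.0 - per_case_miss_rate) ** num_cases

for n in (10, 50, 100, 300):
    p = chance_of_at_least_one_miss(0.01, n)
    print(f"{n:4d} patients -> {p:.0%} chance of at least one missed heart attack")

Under those illustrative assumptions, the chance of at least one miss is roughly 10% after 10 patients and about 63% after 100 - which is exactly the sense in which a "probable" diagnosis, repeated often enough, becomes a losing gamble.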
NASA management faced a unique situation, used the limited available data, opted not to obtain additional information, and likely had a degree of denial of the severity of the situation. All of this resulted in a critical decision gamble that lost. The remote possibility of a Columbia disaster, which eventually became a reality, was not given the full consideration it deserved.
We are all constantly surprised by how very smart people and their teams can make seriously flawed decisions. No person, company, or government agency is immune. If your decisions are based on poor data and probability, your luck will eventually run out. Whether it is Lehman, NASA, or Katrina, many of our most flawed decisions share the same common process. Sometimes when you gamble large, you lose large.
The most important step towards better decision making is early recognition of this "Shuttle Thinking" pattern and the role of "probability" in your decisions. To improve my own process, I now consciously examine the impact of "Shuttle Thinking" on every high-level decision I make.