Did You Know
According to research, during a crisis your brain may use several tricks to deceive you into believing that your rash decisions are well-reasoned ones.
Why it Matters
During a crisis, and especially at the start, when stress, emotions and ambiguity run particularly high, you will often observe crisis managers making poor decisions and following through on them until it is too late and the damage is done. Afterwards, they often wonder why they made the decisions they did when better options were clearly on the table. This sort of "option blindness" is a frequent and serious problem in crisis management, especially among crisis teams that don't employ decision-making safeguards. Research offers insights that help break this problem down into more manageable parts.
The significant impact of biases and heuristics (mental shortcuts) on routine decision-making is well established and frequently discussed in management courses and articles, so I won't focus on it very much here.
But what do we know about decision-making biases under crisis conditions? How do crisis conditions such as time pressure, stress, and ambiguous, fluid situations affect our biases, and to what extent? Which of the 100+ documented biases get amplified during a crisis, and which ones tend to kill individual and team performance? These are central questions in crisis management but still a relatively new topic of research. Psychologist and Nobel laureate Daniel Kahneman offers a few useful insights on this, buried in various parts of his book, that I attempt to summarize here.
First, in high-pressure and/or ambiguous situations, our brain tends to use mental shortcuts and jump to conclusions, even more so than in everyday decision-making. In what psychologists call "strong situations", our brain's default setting is speed, not analysis: it can generate scenarios and decision options faster by relying on our memory of previous experiences than by processing the facts on the ground. It uses those facts to construct a plausible story in our mind but disregards many key facts in the process. In practice, this seems to lead to a number of common and serious mistakes among crisis teams, such as: acting promptly on the first "easy to imagine" option put on the table; acting in a way that is inconsistent with the situation or facts on the ground (but consistent with the story or outcome they want); and being overly optimistic that the approach or solutions that worked in the last crisis will work again.
Second, to make matters worse, our brain also employs an array of powerful and unconscious tricks to convince us that these automatic and biased decisions are good choices. In his famous book "Thinking, Fast and Slow", Kahneman explains that our brain not only "invents causes and intentions" but also "…tricks you into feeling that you are making well-reasoned decisions" that are "…reinforced by pleasant feelings, illusions of truth and reduced vigilance". In these conditions, our brain pumps out an array of hormones and chemicals, like dopamine, that make us feel better and more confident about our decisions, no matter what they are. It's no surprise, then, that in a crisis we have a hard time changing our course of action. Our brains may have evolved this powerful trick as a means of encouraging (or forcing) us to act decisively to get out of a threatening situation. Whatever the reason, it does not serve us well in most (corporate) crisis management situations*. In practice, this may partly explain why we often see crisis managers and teams overestimating their chances of success, sticking to a course of action when it's clearly not working, and dismissing alternatives. One thing is very clear: our crisis teams stand little chance unless they systematically employ decision-making safeguards.
So the next time you find yourself in a crisis, feeling pretty good and comfortable about an important decision, it's worth pausing to ask: did I run it through a consultative decision-making process? Is it indeed based on the facts on the ground, critical thinking and sound reasoning? Or is my brain playing tricks on me?
Crisis teams should employ at least one formal and consultative decision-making process that engages critical thinking. It is also a good idea to employ additional safeguards, such as assigning a dedicated advisor or team to engage in critical thinking/reflection in support of the leader.
Please comment below if you have observed this during a simulation or real crisis.
*This effect may not be as pronounced in well-trained and experienced emergency teams that deal with recurring, similar scenarios.