Human beings are masters of overconfidence. Even when we’re wary of a rose-colored outlook, we find it tough to reliably judge whether a particular course of action is wise. Nor is this effect eliminated by combining fallible people into groups. Daniel Kahneman describes this dynamic in his book Thinking, Fast and Slow.

Can overconfident optimism be overcome by training? I am not optimistic. There have been numerous attempts to train people to state confidence intervals that reflect the imprecision of their judgments, with only a few reports of modest success. An often cited example is that geologists at Royal Dutch Shell became less overconfident in their assessments of possible drilling sites after training with multiple past cases for which the outcome was known. In other situations, overconfidence was mitigated (but not eliminated) when judges were encouraged to consider competing hypotheses. However, overconfidence is a direct consequence of features of System 1 that can be tamed—but not vanquished. The main obstacle is that subjective confidence is determined by the coherence of the story one has constructed, not by the quality and amount of the information that supports it.1

Recall that System 1 thinking is the fast variety, which can also be thought of as intuitive or less effortful thinking. To explore this issue more fully, Kahneman collaborated in 2009 with Gary Klein, a research psychologist known for his work on expert decision making in real-world settings, on the paper “Conditions for Intuitive Expertise: A Failure to Disagree.” In the book, Kahneman makes room for Klein’s alternative perspective on intuition.

Organizations may be better able to tame optimism than individuals are. The best idea for doing so was contributed by Gary Klein, my “adversarial collaborator” who generally defends intuitive decision making against claims of bias and is typically hostile to algorithms. He labels his proposal the premortem. The procedure is simple: when the organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about the decision. The premise of the session is a short speech: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”2

This mental frame, applied by team leadership, is designed to circumvent the suppression of doubt that loyalty to a team naturally produces. Kahneman continues:

As a team converges on a decision—and especially when the leader tips her hand—public doubts about the wisdom of the planned move are gradually suppressed and eventually come to be treated as evidence of flawed loyalty to the team and its leaders. The suppression of doubt contributes to overconfidence in a group where only supporters of the decision have a voice. The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier. The premortem is not a panacea and does not provide complete protection against nasty surprises, but it goes some way toward reducing the damage of plans that are subject to the biases of WYSIATI3 and uncritical optimism.4

I think learning to consciously change our mental frames is vital to improving our judgment and decision making. When it comes time to begin your next big project, whether solo or on a team, take a few minutes to picture what failure would look like. It might help you find success instead.

  1. Kahneman, Daniel. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux, 2013. Kindle edition.

  2. Kahneman, Thinking, Fast and Slow, Kindle edition.

  3. WYSIATI, “What You See Is All There Is,” names one of the ways people jump to conclusions: we act as if we have the complete picture, even when that is far from true. For more information, see Chapter 7 of Thinking, Fast and Slow.

  4. Kahneman, Thinking, Fast and Slow, Kindle edition.