There is something romantic about the notion of intuition.
Who doesn’t love the idea of a respected art critic getting a fleeting view of a work of art and knowing immediately that it is a fake? Or a chess grandmaster walking past a game in the park and boldly pronouncing, “White mates in three”? This leaves us with a nagging feeling that we could become masters of intuition, if only we worked hard enough, making any decision a trivial task. The trouble is, most of life renders intuition unreliable.
As Chip and Dan Heath explain in their book Decisive:
What is sometimes lost in the work celebrating intuition is a sense of the relatively limited domain where it can help us make good decisions. A research consensus is now emerging about situations where intuition reliably generates reasonable answers. Robin Hogarth, one of the researchers who have done the most to clarify situations where intuition does and doesn’t work, describes learning environments along a continuum from kind to wicked. When we acquire our intuitions in a kind environment, our gut instincts are likely to be good, but intuitions acquired in wicked environments are likely to be bad. Feedback in kind environments is clear, immediate, and unbiased by the act of prediction. Forecasting the weather for tomorrow is a kind environment. Feedback is rapid (next day) and clear (it snows or it doesn’t). And the act of making a prediction doesn’t bias the outcome—the rain and snow don’t care about the forecaster.1
In contrast, the learning environment in an emergency room is wicked because of the lack of long-term feedback. Most ER docs and nurses get good short-term feedback (I either help the patient stop bleeding or I don’t) but bad long-term feedback, since they don’t see what happens to a patient once he or she leaves the emergency room (e.g., did something we did to stop the bleeding cause greater complications down the road?). The learning environment for new-product launches is wicked on all three dimensions. Feedback is unclear (perhaps Pets.com was a bad idea or perhaps it was just ahead of its time), it is delayed (often for months or years), and it is biased by the very act of prediction (classifying a launch as high priority or low has self-fulfilling ramifications for, say, its ad budget or the quality of the personnel on the launch team). Because of the environments they operate in, we will be better off trusting the intuitions of the weatherman than the entrepreneur or brand manager launching a new product. We should trust the ER doc to find an effective short-term solution to a health crisis but not to recommend good long-term actions for a chronic condition…2
Somewhat depressingly, the situations where we should most trust our instincts don’t characterize many of the most important decisions that we make in life—which college to go to, whom to marry, which product to launch, which employee to promote. Professor Rick Larrick of Duke University has a compact summary of the kinds of environments that have been reliably found to develop good intuition: He calls them “video game worlds”—they are environments that provide quick, unambiguous, unalterable feedback. Video games, however, allow you to die and come back to life multiple times as you learn. For the kinds of decisions that [are covered] in this book, life doesn’t typically allow many do-overs.3
Unless our feedback is clear, immediate, and unbiased by the act of prediction, we cannot train our intuition. Unless we can test that feedback many times, we cannot trust our intuition. In every other case, intuition is just another unreliable input to the decision-making process.