Yesterday, we distinguished between judgment and decision making. Recall that judgment is the assignment of odds prior to making a decision, and that in this process we face both internal and external challenges. Today I want to focus on the internal.
These days, developing a comprehensive understanding of cognitive biases seems to be step one in learning how to argue on the internet. I am not going to cover ground that is well trod elsewhere, but a definition of terms is appropriate.
Systematic errors are known as biases, and they recur predictably in particular circumstances. When the handsome and confident speaker bounds onto the stage, for example, you can anticipate that the audience will judge his comments more favorably than he deserves. The availability of a diagnostic label for this bias—the halo effect—makes it easier to anticipate, recognize, and understand.1
To generalize broadly: the positive response we feel toward this handsome and confident speaker is instinctive, while the reasoned, more effortful analysis that catches the unwarranted response requires conscious attention. Daniel Kahneman calls these fast and slow thinking, respectively.
The distinction between fast and slow thinking has been explored by many psychologists over the last twenty-five years… I describe mental life by the metaphor of two agents, called System 1 and System 2, which respectively produce fast and slow thinking. I speak of the features of intuitive and deliberate thought as if they were traits and dispositions of two characters in your mind. In the picture that emerges from recent research, the intuitive System 1 is more influential than your experience tells you, and it is the secret author of many of the choices and judgments you make….2
Beyond the extensive catalog of System 1's leaps to judgment, one area I find particularly fascinating is the way we lie to ourselves through self-justification.
Self-justification is not the same thing as lying or making excuses. Obviously, people will lie or invent fanciful stories to duck the fury of a lover, parent, or employer; to keep from being sued or sent to prison; to avoid losing face; to avoid losing a job; to stay in power. But there is a big difference between a guilty man telling the public something he knows is untrue (“I did not have sex with that woman”; “I am not a crook”) and that man persuading himself that he did a good thing. In the former situation, he is lying and knows he is lying to save his own skin. In the latter, he is lying to himself. That is why self-justification is more powerful and more dangerous than the explicit lie. It allows people to convince themselves that what they did was the best thing they could have done. In fact, come to think of it, it was the right thing. “There was nothing else I could have done.” “Actually, it was a brilliant solution to the problem.” “I was doing the best for the nation.” “Those bastards deserved what they got.” “I’m entitled.”3
This explains the immediate reaction one might have after making a mistake, but what about after we’ve had a chance to consider the facts?
Now, between the conscious lie to fool others and unconscious self-justification to fool ourselves, there’s a fascinating gray area patrolled by an unreliable, self-serving historian—memory. Memories are often pruned and shaped with an ego-enhancing bias that blurs the edges of past events, softens culpability, and distorts what really happened. When researchers ask wives what percentage of the housework they do, they say, “Are you kidding? I do almost everything, at least 90 percent.” And when they ask husbands the same question, the men say, “I do a lot, actually, about 40 percent.” Although the specific numbers differ from couple to couple, the total always exceeds 100 percent by a large margin. It’s tempting to conclude that one spouse is lying, but it is more likely that each is remembering in a way that enhances his or her contribution.4
Over time, as the self-serving distortions of memory kick in and we forget or misremember past events, we may come to believe our own lies, little by little. We know we did something wrong, but gradually we begin to think it wasn’t all our fault, and after all, the situation was complex. We start underestimating our own responsibility, whittling away at it until it is a mere shadow of its former hulking self. Before long, we have persuaded ourselves to believe privately what we said publicly.5
Are we the bad guy in our own life story? Certainly not. And when reality doesn’t match our preconceptions, it becomes very hard to hold the two conflicting ideas in our head at the same time. Something has to give. This effect is commonly known as cognitive dissonance.
Dissonance is disquieting because to hold two ideas that contradict each other is to flirt with absurdity, and, as Albert Camus observed, we are creatures who spend our lives trying to convince ourselves that our existence is not absurd.6
This is a rich area for study, but the above should serve as a good overview of the internal challenges we face while seeking to improve our judgment. Tomorrow I’ll take a look at the external challenges.
Tavris, Carol, and Elliot Aronson. Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. Orlando, FL: Harcourt, 2007. ↩︎