Emotions continually affect our thought processes and decisions, below the level of our awareness. And the most common emotion of them all is the desire for pleasure and the avoidance of pain. Our thoughts almost inevitably revolve around this desire; we simply recoil from entertaining ideas that are unpleasant or painful to us. We imagine we are looking for the truth, or being realistic, when in fact we are holding on to ideas that bring a release from tension and soothe our egos, that make us feel superior. This pleasure principle in thinking is the source of all of our mental biases. If you believe that you are somehow immune to any of the following biases, it is simply an example of the pleasure principle in action. Instead, it is best to search for and observe how they continually operate inside you, and to learn how to identify such irrationality in others.
The Confirmation Bias
I look at the evidence and arrive at my decisions through more or less rational processes.
To hold an idea and convince ourselves we arrived at it rationally, we go in search of evidence to support our view. What could be more objective or scientific? But because of the pleasure principle and its unconscious influence, we manage to find the evidence that confirms what we want to believe. This is known as confirmation bias.
We can see this at work in people’s plans, particularly those with high stakes. A plan is designed to lead to a positive, desired objective. If people considered the possible negative and positive consequences equally, they might find it hard to take any action. Inevitably they veer toward information that confirms the desired positive result, the rosy scenario, without realizing it. We also see this at work when people are supposedly asking for advice. This is the bane of most consultants. In the end, people want to hear their own ideas and preferences confirmed by an expert opinion. They will interpret what you say in light of what they want to hear; and if your advice runs counter to their desires, they will find some way to dismiss your opinion, your so-called expertise. The more powerful the person, the more they are subject to this form of the confirmation bias.
When investigating confirmation bias in the world, take a look at theories that seem a little too good to be true.
Statistics and studies are trotted out to prove them; these are not very difficult to find, once you are convinced of the rightness of your argument. On the internet, it is easy to find studies that support both sides of an argument. In general, you should never accept the validity of people’s ideas because they have supplied “evidence.” Instead, examine the evidence yourself in the cold light of day, with as much skepticism as you can muster. Your first impulse should always be to find the evidence that disconfirms your most cherished beliefs and those of others. That is true science.
The Conviction Bias
I believe in this idea so strongly. It must be true.
We hold on to an idea that is secretly pleasing to us, but deep inside we might have some doubts as to its truth, and so we go an extra mile to convince ourselves—to believe in it with great vehemence and to loudly contradict anyone who challenges us. How can our idea not be true, we tell ourselves, if it brings out in us such energy to defend it? This bias is revealed even more clearly in our relationship to leaders—if they express an opinion with heated words and gestures, colorful metaphors and entertaining anecdotes, and a deep well of conviction, it must mean they have examined the idea carefully to express it with such certainty. Those, on the other hand, who express nuances, whose tone is more hesitant, reveal weakness and self-doubt. They are probably lying, or so we think. This bias makes us susceptible to salesmen and demagogues who display conviction as a way to convince and deceive. They know that people are hungry for entertainment, so they cloak their half-truths with dramatic effects.
The Appearance Bias
I understand the people I deal with; I see them just as they are.
We see people not as they are, but as they appear to us. And these appearances are usually misleading. First, people have trained themselves in social situations to present the front that is appropriate and that will be judged positively. They seem to be in favor of the noblest causes, always presenting themselves as hardworking and conscientious. We take these masks for reality. Second, we are prone to fall for the halo effect—when we see certain negative or positive qualities in a person (social awkwardness, intelligence), other positive or negative qualities are implied that fit with this. People who are good-looking generally seem more trustworthy, particularly politicians. If a person is successful, we imagine they are probably also ethical, conscientious, and deserving of their good fortune. This obscures the fact that many people who have gotten ahead have done so through less-than-moral actions, which they cleverly disguise from view.
The Group Bias
My ideas are my own. I do not listen to the group. I am not a conformist.
We are social animals by nature. The feeling of isolation, of difference from the group, is depressing and terrifying. We experience tremendous relief when we find others who think the same way we do. In fact, we are motivated to take up ideas and opinions because they bring us this relief. We are unaware of this pull and so imagine we have come to certain ideas completely on our own. Look at people who support one party or the other, one ideology—a noticeable orthodoxy or correctness prevails, without anyone saying anything or applying overt pressure. If someone is on the right or the left, their opinions will almost always follow the same direction on dozens of issues, as if by magic, and yet few would ever admit this influence on their thought patterns.
The Blame Bias
I learn from my experience and mistakes.
Mistakes and failures elicit the need to explain. We want to learn the lesson and not repeat the experience. But in truth, we do not like to look too closely at what we did; our introspection is limited. Our natural response is to blame others, circumstances, or a momentary lapse of judgment. The reason for this bias is that it is often too painful to look at our mistakes. It calls into question our feelings of superiority. It pokes at our ego. We go through the motions, pretending to reflect on what we did. But with the passage of time, the pleasure principle rises and we forget what small part in the mistake we ascribed to ourselves. Desire and emotion will blind us yet again, and we will repeat exactly the same mistake and go through the same mild recriminating process, followed by forgetfulness, until we die.
If people truly learned from their experience, we would find few mistakes in the world and career paths that ascend ever upward.
The Superiority Bias
I’m different. I’m more rational than others, more ethical as well.
Few would say this to people in conversation. It sounds arrogant. But in numerous opinion polls and studies, when asked to compare themselves with others, people generally express a variation of this. It’s the equivalent of an optical illusion—we cannot seem to see our faults and irrationalities, only those of others. So, for instance, we’ll easily believe that those in the other political party do not come to their opinions based on rational principles, but those on our side have done so. On the ethical front, few of us will ever admit that we have resorted to deception or manipulation in our work or have been clever and strategic in our career advancement. Everything we’ve got, or so we think, comes from natural talent and hard work. But with other people, we are quick to ascribe to them all kinds of Machiavellian tactics. This allows us to justify whatever we do, no matter the results.
We feel a tremendous pull to imagine ourselves as rational, decent, and ethical. These are qualities highly promoted in the culture. To show signs otherwise is to risk great disapproval. If all of this were true—if people were truly rational and morally superior—the world would be suffused with goodness and peace. We know the reality, however, and so some of us, perhaps all of us, are merely deceiving ourselves. Rationality and ethical qualities must be achieved through awareness and effort. They do not come naturally. They come through a maturation process.