Are humans more rational or emotional in their decision-making? Do you think you tend to make decisions based more on logic or emotion?
Many people would say that their decisions are ruled mostly by reason and logic. But the truth is that most people make decisions far less rationally than they realize. We are all subject to cognitive biases and heuristics, and pulled by our emotions. We make decisions first and rationalize them after the fact.
In this article, we’ll take a look at some of the most common cognitive heuristics and demonstrate how they manifest with real-world examples.
What Are Cognitive Biases?
Cognitive heuristics may exist because they save the brain energy: the resulting thoughts are not perfect, but they are good enough for survival. It would be exhausting to reinterpret your reality with every new piece of information.
The important principle here is that biases are not flaws in our operating system—they are the operating system.
Funnily enough, these biases remain effective even when you know exactly what’s going on.
- Everyone knows that $9.99 is chosen because it looks much cheaper than $10.00. It still works.
- The McGurk effect: video of someone mouthing “fah” dubbed over audio of “bah” makes you hear “fah.” Even when you know about the effect, you are still subject to it.
Confirmation Bias

What it is: You pay attention to information that confirms your prior beliefs, and discard data that contradicts them.
In politics, all sides suffer from this.
- There was alleged collusion between Russia and the Trump campaign. No smoking gun was reported, but there were stories of Russian influence on the election (e.g., via Facebook). Anti-Trumpers say there is so much smoke that collusion must have happened; Trump supporters say that the absence of evidence proves Trump innocent.
- (Shortform example: watch the different TV channels for liberals and conservatives and you’ll rarely see the same top stories at once. A scandal involving a conservative politician will be all the rage on the liberal channel and totally absent from the conservative one, and vice versa.)
When trying to persuade, you might think that facts alone can win the day. But people simply filter the information for whatever confirms their current beliefs. People rarely change their opinions just because they see information that contradicts those opinions.
Cognitive Dissonance

What it is: When people perform actions that are inconsistent with their underlying beliefs, they rationalize the action to fit those beliefs, often forming delusions.
- (Shortform example: in one classic experiment, people who were paid nothing for a tedious task reported enjoying it more than people who were paid for the same task. The unpaid participants had to reason, subconsciously: “This task is boring. But clearly I’m not doing it for money, since I’m not getting paid. So maybe I enjoy it more than I thought I did.”)
- If you believe you’re an honest person but you do something dishonest, you rationalize the action as justified, often in a convoluted way.
According to Scott Adams, a “tell” for cognitive dissonance is how absurd the rationalizations are, and how many of them pile up.
- Someone claims smoking won’t hurt him because some person smoked a pack a day and lived to be 100. This is a personal illusion: he believes he is one of the few people alive who is immune to lung cancer.
Another tell is responding with an absurd absolute position, combined with a personal insult. Such a person doesn’t have a rational reason for their views; the resulting dissonance in their mind is resolved by discrediting the other person’s viewpoint.
- Someone expresses a view in favor of gun ownership. In return, someone says, “ha, so I guess you want to give guns to toddlers!”
What it is: People don’t want to change their minds. If you attack a person’s belief, that person will double down and entrench, rationalizing it along the way.
Recency Bias, Availability Bias
What it is: You tend to overweight information that you thought about recently, or that is more available to you.
Be wary when someone repeatedly presses your buttons to keep returning you to an issue; the repetition makes it feel more important than it is.
Two Movies on One Screen / Filters
What it is: People see different realities. Given the exact same set of data, two people with two different reality filters will see two different things. Scott Adams calls this “one picture, two movies.”
- Two people can view the same Presidential speech and have wildly different conclusions from it.
- Mind-altering substances (even alcohol or caffeine will do) give you a different experience of the world. You can experience different realities yourself just by going from sober to high.
Beware of selecting inaccurate filters. It’s easy to fit completely different explanations to the same set of facts. To be useful, the interpretation must be able to predict accurately.
Mass Delusions

People often assume mass delusions are rare, but Scott Adams argues they are the norm; a population behaving rationally is the exception.
Mass delusions are often due to a combination of biases, including social proof, confirmation bias, and loss aversion.
- Stock market booms and busts
- News reports that exaggerate the prevalence of a problem.
- Orson Welles’s 1938 radio broadcast of “War of the Worlds.” A small portion of the listeners who heard it believed an alien invasion was occurring; it later became a folk myth that much of the country had been fooled.
Delusions can occur when there are 1) complicated prediction models with lots of assumptions, and 2) financial and psychological pressure to agree with the consensus. The mass delusion then feeds a vicious cycle, swallowing more and more people.
- Scott Adams argues climate change has these tinges: climate prediction models are complicated, with room for bias in their assumptions. Tweak the assumptions and you can get any outcome you want.
- (Shortform note: in medicine, the consensus long held that stomach ulcers were caused by stomach acid. Barry Marshall believed they were caused by a bacterium, but the consensus ruled that bacteria could not survive stomach acid. He drank a vial of the bacteria (Helicobacter pylori), developed gastritis, and eventually overturned the consensus, later sharing a Nobel Prize for the discovery.)
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Scott Adams's "Win Bigly" at Shortform.
Here's what you'll find in our full Win Bigly summary:
- The persuasion tactics Donald Trump used throughout the 2016 presidential campaign
- Why Hillary Clinton's campaign fell short
- How to leverage people’s biases and irrationality to persuade them of your point of view