Heuristics Theory: A Revolutionary View of Decision-Making

This article is an excerpt from the Shortform book guide to "The Undoing Project" by Michael Lewis.

What’s heuristics theory? How was it developed? What are the three original heuristics?

Between 1969 and 1979, psychologists Daniel Kahneman and Amos Tversky overturned our understanding of how we make decisions. The lessons from that decade of research call into question much of what we think about how we think.

Read more to learn how Tversky and Kahneman revealed through heuristics theory that the mind relies on impressions over logic.

The Birth of Heuristics

Lewis relates the history of heuristics theory in his book The Undoing Project. In 1970, Tversky went to the Oregon Research Institute to further his work. Scientists there had begun to study how experts actually drew their conclusions, as opposed to how they claimed they did. Kahneman continued his research at home in Israel.

In America and Israel, the pair of psychologists presented students and children with a battery of oddball statistical problems. Lewis explains that they expected their students to get the answers wrong; what they wanted to study was how they got them wrong. After conducting these studies, Kahneman and Tversky reconvened in Oregon in 1971 to sort their data.

They found that the mind does not calculate statistics. Instead, it applies rules of thumb that Kahneman and Tversky dubbed “heuristics”—mental models that break down in problems that contain random elements. (Shortform note: Kahneman and Tversky discovered several distinct heuristics, including the “representativeness” heuristic (described in a paper on subjective probability) and the “availability” heuristic (defined in a paper on judging frequency and probability).)

Lewis asserts that, by studying the ways in which the mind fails, Tversky and Kahneman revealed how it works. They discovered that people, even experts in their fields, rely heavily on stereotypes and ignore statistics. (Shortform note: The way we let stereotypes shape our judgment is particularly pernicious when it comes to the subject of race. In Biased, Jennifer Eberhardt identifies unconscious racial stereotyping as a pathway to confirmation bias, in which the mind only accepts data that conforms to a predetermined narrative.)

Given how important their discoveries were, Kahneman and Tversky wanted to reach a wider audience than the limited circles of academic psychology. Lewis writes that Tversky in particular gave talks to groups in other academic fields to make experts more aware of their own cognitive biases.

(Shortform note: Reaching from one academic discipline to another is still considered unusual. The National Academy of Sciences reports that collaborations between economists and psychologists are rare because both fields have different methods and objectives. However, it’s becoming more commonplace—interdisciplinary research is a growing field, with psychology acting as a hub for other sciences that focus on specific aspects of human behavior.)

The Three Heuristics

Kahneman and Tversky determined that, when making judgments, the mind doesn’t unconsciously calculate statistics, as was the common belief of the time. Instead, the mind applies stories and stereotypes through processes that Tversky and Kahneman called heuristics. In short, our minds use a variety of shortcuts to make guesses when we don’t have enough information. 

In their research, they identified three separate heuristics that systematically cloud human judgment—representativeness, availability, and anchoring. (Shortform note: Heuristics have become a standard psychological tool for describing the mental shortcuts the brain takes when evaluating judgments and decisions. In Algorithms to Live By, Brian Christian and Tom Griffiths argue that you can train yourself to use heuristics derived from computer programming to make better decisions and optimize your time. Algorithms, they claim, let us make more efficient use of our limited memory and attention so we can avoid analysis paralysis and decision fatigue.)
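
To make the idea of a computational heuristic concrete, here is a minimal Python sketch of one shortcut Christian and Griffiths discuss: the "37% rule" for optimal stopping, which says to review roughly the first 37% of your options without committing, then take the first option that beats everything you've seen. (The function name and simulation setup below are our own illustration, not code from their book.)

```python
import random

def thirty_seven_percent_rule(candidates):
    """Optimal-stopping heuristic: observe the first ~37% of options
    without committing, then pick the first one that beats them all."""
    n = len(candidates)
    cutoff = int(n * 0.37)  # the "look" phase
    best_seen = max(candidates[:cutoff], default=float("-inf"))
    for value in candidates[cutoff:]:
        if value > best_seen:
            return value  # the "leap" phase: commit to the first improvement
    return candidates[-1]  # ran out of options; take the last one

# Rough check: how often does the heuristic land on the single best option?
random.seed(42)
trials = 10_000
hits = sum(
    thirty_seven_percent_rule(pool) == max(pool)
    for pool in (random.sample(range(1000), 100) for _ in range(trials))
)
print(f"Picked the best candidate in {hits / trials:.0%} of trials")  # ~37%
```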

Representativeness

The representativeness heuristic describes the way our minds persistently compare people and events to stereotypes and other assumptions based on past interactions. Lewis points out that, from an evolutionary perspective, this heuristic is a handy mental measure to speed up decision-making in crucial situations—such as determining whether that shadow up ahead is a panther about to attack you from a tree. However, Tversky and Kahneman showed that this heuristic breaks down when random elements are involved.

According to Kahneman and Tversky, the human mind fundamentally misunderstands randomness, to the point that we concoct inaccurate beliefs to explain why random things happen. In particular, as Lewis explains, people find it hard to accept that randomness naturally generates clusters that look like patterns even when they're not. Instead, we wrongly expect randomness to create an even spread. For example, we expect a series of coin flips to produce heads and tails in equal numbers, when nothing guarantees an even split in any given sequence. To our minds, an even distribution is more "representative" of what we believe a random sample will produce, so we invent erroneous stories to explain any coincidences that naturally occur.
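
A quick simulation makes this concrete. The Python sketch below (our own illustration, not from Lewis or the original studies) flips a fair coin 100 times: individual runs rarely split exactly 50/50, and streaks of identical outcomes show up even though the process is entirely random.

```python
import random

random.seed(7)
flips = [random.choice("HT") for _ in range(100)]

heads = flips.count("H")
print(f"Heads: {heads}, Tails: {100 - heads}")  # rarely an exact 50/50 split

# Find the longest streak of identical outcomes in the sequence.
longest = current = 1
for prev, nxt in zip(flips, flips[1:]):
    current = current + 1 if nxt == prev else 1
    longest = max(longest, current)
print(f"Longest streak: {longest}")  # streaks of 5-7 are typical, not suspicious
```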

(Shortform note: There’s nothing special about coincidence; randomness guarantees that it will occur. In The Improbability Principle, mathematician David Hand invokes the Law of Truly Large Numbers to explain that the sheer number of possibilities and opportunities in the world makes oddball occurrences statistically inevitable. They only seem significant because our minds demand a narrative to explain why they happen.)

Lewis says that this cognitive error is problematic in disciplines like psychology, social science, and even medicine, where research is performed on small sample groups that may not represent the larger population because of random factors in test group selection. (Shortform note: The smaller the sample size used in a study, the larger its margin of error. In a paper published in 2005, medical professor John Ioannidis asserts that most current research findings are flawed due to the statistical limitations of the studies they’re based on.)
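
The statistical point is easy to demonstrate. In this Python sketch (our own illustration of sampling variability, not Ioannidis's analysis), repeated small samples drawn from the same population produce estimates that swing far more widely around the true mean than large samples do:

```python
import random
import statistics

random.seed(0)
population = [random.gauss(100, 15) for _ in range(100_000)]  # true mean ~100

def spread_of_sample_means(sample_size, n_samples=1000):
    """Draw repeated samples and return how widely their means vary."""
    means = [statistics.mean(random.sample(population, sample_size))
             for _ in range(n_samples)]
    return statistics.stdev(means)

print(f"Spread of means, n=10:  {spread_of_sample_means(10):.2f}")   # wide: ~4.7
print(f"Spread of means, n=500: {spread_of_sample_means(500):.2f}")  # narrow: ~0.7
```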

Availability

Availability, the second heuristic, describes how you judge any given scenario as more likely if you can easily recall a similar situation. Lewis states that this heuristic makes us draw conclusions based on common occurrences, recent events, or anything that weighs heavily on our minds. For example, after watching a movie about a serial killer, you may suddenly be afraid of becoming a victim yourself, even though the actual likelihood did not change when you saw the film. (Shortform note: While this effect may seem innocuous, it can have a negative impact on decision-making, such as when a manager overlooks an employee’s good record in favor of one recent mistake. It can also increase your anxiety by making you dwell on unlikely events.)

Lewis says that, as with representativeness, the availability heuristic makes sense from an evolutionary standpoint—scenarios that occur more frequently may indeed be more likely than others. However, on the societal level, this heuristic leads to self-reinforcing systemic bias. For an individual, it can trick you into drawing poor conclusions when the proper evidence isn’t readily available, but misleading information is.

(Shortform note: In Biased, Jennifer Eberhardt goes into detail about the self-reinforcing societal impact of both the availability and representativeness heuristics, though she does not cite the heuristics by name. Availability comes into play when negative depictions of Black people in media create more “available” memories from which people make associations. Representativeness comes into play in what Eberhardt dubs the other-race effect, in which people judge individuals based on preconceptions about the group they belong to.)

Anchoring

Anchoring, the third heuristic, is a phenomenon related to how the mind deals with numbers. Tversky and Kahneman found that when they asked test subjects to estimate numbers, they could manipulate the subjects’ guesses by “priming” them with irrelevant information.

For example, if students were told, “There are 14 lines in a sonnet,” then asked to guess how many countries there are in Africa, their answers would tend to be low. If another group were told, “There are 5,000 students enrolled in this college,” their guesses at the number of countries would run high. (The correct answer as of 2022 is 54.) Lewis claims that Kahneman and Tversky weren’t able to identify why the brain behaves this way, but the fact that it does reveals another way that the mind is vulnerable to error.

How Numbers Fool the Mind

Kahneman elaborates on the anchoring effect in Thinking, Fast and Slow, where he identifies two mechanisms that cause it. First, the mind uses the anchor number as an initial guess from which to adjust. Second, the mind makes an association with the anchor—high or low, big or small, long or short—that colors the narrative within which the guess is made.
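
The first mechanism—starting from the anchor and adjusting, often insufficiently—is sometimes described as “anchoring-and-adjustment.” The Python sketch below is a toy version of that idea, not Kahneman’s actual formulation; the adjustment weight is an assumed parameter chosen purely for illustration.

```python
def anchored_estimate(anchor, unbiased_guess, adjustment=0.9):
    """Toy anchoring-and-adjustment model: start at the anchor and move
    only part of the way (adjustment < 1) toward an unbiased guess,
    so the final estimate stays biased toward the anchor."""
    return anchor + adjustment * (unbiased_guess - anchor)

# Guessing the number of African countries (actual answer: 54) after being
# primed with a low anchor (14 lines in a sonnet) vs. a high anchor
# (5,000 enrolled students). The 0.9 adjustment weight is assumed.
print(anchored_estimate(anchor=14, unbiased_guess=54))    # 50.0 -- pulled low
print(anchored_estimate(anchor=5000, unbiased_guess=54))  # 548.6 -- pulled high
```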

Others have noted that the anchoring effect is of particular use to marketers, who influence customers’ perceptions about price by anchoring their impressions about how much a product should cost. Some research has suggested that anchoring affects moral judgments as well, such as when one person’s ethical opinion is used as an anchor for those of others.

Exercise: How Strong Are Your Heuristics?

Tversky and Kahneman developed the theory that our minds use a variety of shortcuts to make guesses when we don’t have all the information. How good a guesser are you? The answers to these questions are listed below. (Don’t peek.)

  • Texas, the largest state in the contiguous US, is divided into 254 counties. Brazil is the largest country in South America. Without looking it up, estimate the number of separate states that make up the country of Brazil.
  • The United States suffers the second-highest number of natural disasters in the world (after China). Guess what percentage of insured homes make claims for damage in the U.S. per year.

Brazil is divided into 26 states. If you guessed significantly higher, you may have fallen prey to the anchoring heuristic. What examples from your own life can you think of in which an “anchored” value, such as the price of a purchase, influenced your perception of something else’s value?

Only 6% of insured homeowners in the U.S. make claims per year, and not all of those are for damage from natural causes. If your guess was significantly higher, you may have been affected by the availability heuristic: you were cued to think about natural disasters, which may have made damage to homes seem more probable. When in your life has something weighing on your mind made an event feel more likely than it really was?
