
This article is an excerpt from the Shortform summary of "Thinking, Fast and Slow" by Daniel Kahneman. Shortform has the world's best summaries of books you should be reading.


How can you make good decisions using such flawed methods of thinking? In Thinking, Fast and Slow, Daniel Kahneman’s solution is to learn to recognize situations when System 1 is vulnerable to mistakes, so you can bring your rational System 2 to bear. 

To that end, we’ll briefly review a number of common heuristics (mental shortcuts) and cognitive biases (thinking errors that heuristics can lead to) that can mislead System 1 thinking. A solid understanding of these heuristics and biases can help you think more clearly and make wiser decisions.


Heuristics: Mental Shortcuts

Heuristics and biases are central to how we think. Kahneman explains that heuristics are mental tools by which System 1, when faced with a complex question, substitutes an easier question and answers that instead. The simple answers System 1 produces are usually imperfect but adequate for most everyday situations. However, when you’re facing a complex situation with high stakes, these simple answers can mislead you.

For example, a difficult but low-stakes question might be “What should I order for dinner?” To find the best answer, you’d have to consider many different factors, ranging from cost and nutritional value to personal preference. Therefore, you’ll probably substitute the much simpler heuristic question “What do I feel like eating?” This is easier to answer because it only relies on an emotional impulse, rather than multiple factors. 

In contrast, a high-stakes question might be “Is this company worth investing in?” To answer this question, you should think logically about how likely that company is to be successful, and therefore how likely it is that its stock price will increase. However, System 1 might substitute a heuristic question such as “Do I like this company?” In this situation, you risk losing money—potentially a lot of money—because you made an important decision based on your feelings toward a company rather than an honest assessment of its performance.

System 2 has the capacity to recognize and reject these heuristic answers, but it will often endorse them without further scrutiny. Therefore, Kahneman urges you to make sure you actually answered the questions you think you answered before you make important decisions. In other words, consider whether you came to your decision through difficult, rational thought or if you fell back on simplistic and impulsive ideas.

The Birth of Heuristics

Kahneman, working with fellow psychologist Amos Tversky, developed this concept of heuristics from the late 1960s through the early 1970s. In The Undoing Project, biographer Michael Lewis explains how the two collaborated on a research project that eventually formed the foundation of this “heuristic” theory of decision-making.

Kahneman and Tversky set out to determine whether people make guesses and predictions based on statistics and reason (as Tversky believed) or through some other method. To study this, they devised a series of questions that could be solved through mathematical logic, but intentionally wrote them in ways that would mislead people. The two psychologists fully expected people to get the questions wrong, but were interested to see specifically how and why people got them wrong. 

By studying the results of these tests, Kahneman and Tversky identified a number of different logical flaws and fallacies. Of particular note were the common belief in a “Law of Small Numbers”—people assume that a small set of data accurately represents a larger population—and the fact that even experts in their fields tended to rely on stereotypes and preconceived notions, rather than on facts and statistics.
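
To get a feel for why the “Law of Small Numbers” is a fallacy, it can help to simulate it. The short Python sketch below (our illustration, not part of Kahneman and Tversky’s research) flips a fair coin in samples of different sizes and counts how often a sample looks lopsided:

    import random

    def share_of_lopsided_samples(sample_size, trials=10_000, threshold=0.10):
        """Fraction of samples whose heads-rate strays from the true 50% by more than threshold."""
        lopsided = 0
        for _ in range(trials):
            heads = sum(random.random() < 0.5 for _ in range(sample_size))
            if abs(heads / sample_size - 0.5) > threshold:
                lopsided += 1
        return lopsided / trials

    # Small samples often look unrepresentative of the population; large samples rarely do.
    for n in (10, 50, 1000):
        print(f"n={n}: {share_of_lopsided_samples(n):.0%} of samples stray more than 10 points from 50%")

With samples of 10 flips, roughly a third of samples look noticeably “biased” toward heads or tails; with 1,000 flips, virtually none do. In other words, small data sets routinely suggest patterns that don’t exist in the larger population, which is exactly the faulty intuition behind the “Law of Small Numbers.”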

Common Cognitive Biases and Fallacies

Kahneman also warns that, in addition to substituting simpler questions for difficult ones, System 1 thinking is prone to numerous biases that cause you to misinterpret information. As a result, even when you try to think rationally about your decisions, you may be influenced by errors in thinking that lead to mistakes. 

Some of the most common cognitive biases include:

Confirmation bias: People tend to selectively pay attention to information that confirms what they already believe and discard information that doesn’t. Similarly, people often interpret things in ways that support their existing beliefs. For example, a person with conservative views is prone to interpreting crime data in a way that supports their beliefs on gun control—while a person with progressive views will interpret the exact same data in a way that supports their completely opposing gun-control beliefs.

(Shortform note: Information that challenges our beliefs feels threatening, and therefore confirmation bias serves as a defense mechanism against such information. In Awaken the Giant Within, life coach Tony Robbins explains that our beliefs are key parts of our identities. As a result, the thought that a deeply held belief could be wrong feels like a personal attack—not just that we’re wrong about something, but that something about us is fundamentally wrong.)

Availability bias: Kahneman writes that people pay more attention, and attach more emotional weight, to events they’ve recently heard about—in other words, events that are more readily available in working memory. When events are fresher in someone’s memory, that person often believes those events are more common than they actually are. Meanwhile, they’ll ignore issues that may be statistically more likely to happen but haven’t been brought to their attention lately.

For example, someone living in a city might frequently hear news stories about crime among homeless people and might come to believe they’re in great danger every time they leave their apartment. They’ll believe this even if they’re statistically unlikely to be affected by such crime. That same person might be far more likely to be hit by a delivery bike, but since those accidents rarely make the news, the person doesn’t think about them, doesn’t feel they’re common, and isn’t afraid of them.

(Shortform note: The availability bias is closely related to the salience bias, where we notice things that stand out because of their vividness or emotional impact. Both of these biases are, in turn, correlated with the negativity bias, by which we pay more attention to negative information than to positive. These three biases work together to create a loop: When people talk about shocking events because of the salience and negativity biases, those events become more available in our memories, which makes us focus on them more, leading us to talk about them even more. This loop is why people are often more afraid of (rare but shocking) shark attacks than of (far more common but less dramatic) car accidents.)

Anchoring bias: When shown an initial piece of information, people’s decisions will be skewed toward that information, even if it’s irrelevant to the decision at hand. For instance, in one study, when a nonprofit requested $400, the average donation was $143; when it requested $5, the average donation was $20. In this example, the suggested donation influenced how much people actually donated, even though it wasn’t really relevant—people were free to donate as much or as little as they wanted to.

(Shortform note: Anchoring bias may result in part from the phenomenon of psychological priming. When we are exposed to an idea, that idea activates, or primes, parts of our brain, and those parts of the brain stay active as we process further information. This can affect our thinking by anchoring us to the first idea we heard and to the mental connections we drew from it. To avoid anchoring bias, actively try to think of counterarguments or alternative options, and look for reasons why they might be better than the anchored information.) 

Narrative fallacy: Kahneman says that people try to create coherent stories to explain random events. Then, because their stories sound plausible, people feel an unjustified level of confidence in their ability to predict future events. For example, two boxers might be so evenly matched that the outcome of their match could go in either direction. However, sports pundits discussing the match will invent stories about how the loser buckled under the pressure, or the winner “wanted it more.” If those boxers were to have a rematch, the pundits would try to predict the winner based on the stories they previously created, even though the outcome would be just as unpredictable as before.

(Shortform note: The narrative fallacy comes from the natural human desire to understand and control, or at least predict, the world around you. For example, psychologists believe that conspiracy theories—extreme narrative fallacies that draw connections between totally unrelated events—are actually self-soothing anxiety responses. If there’s a group secretly orchestrating events (for instance, the Illuminati are popular scapegoats), that means nothing is ever truly random; therefore, any future disaster can be predicted and prepared for. While belief in a global, nearly omnipotent group like the Illuminati might seem terrifying, some people find comfort in having a tangible enemy to fight, rather than being at the mercy of random chance.)

Narrow framing: According to Kahneman, people tend to make decisions based on relatively small pools of information instead of considering the whole picture. There are numerous ways this fallacy can manifest. For example, in the planning fallacy, people tend to overlook all the ways a project could go wrong, and therefore underestimate how much time it will require—they only factor in information about how long it will take in an ideal situation. Another example is the sunk cost fallacy, which happens when people narrow their focus to getting back what they’ve lost on a failed endeavor. However, if they consider all of their options, they’ll see it’s better to cut their losses and invest their resources elsewhere.

(Shortform note: Narrow framing may be the unavoidable result of working memory’s natural limitations. Your working memory—where your brain stores ideas that you’re currently using to solve a problem or make a decision—can only hold a few pieces of information at once. Researchers disagree on exactly how many ideas it can hold, and capacity varies from person to person, but common estimates place the average at somewhere between two and four ideas. As a result, regardless of whether you’re using System 1 or System 2 thinking, it’s simply not possible to consider everything when making a decision.)



