
This article is an excerpt from the Shortform summary of "Thinking, Fast and Slow" by Daniel Kahneman. Shortform has the world's best summaries of books you should be reading.


Isn’t it remarkable how often we make decisions without realizing it? You like or dislike people before you know much about them; you feel a company will succeed or fail without really analyzing it. But what is cognitive bias, and how susceptible are these quick judgments to it?

Cognitive bias is an error in thinking that affects our judgments. These biases are the result of quick, intuitive thinking below the conscious level. Learn more about what they are with the common cognitive bias examples below.


Common Cognitive Biases and Fallacies

In Thinking, Fast and Slow, Daniel Kahneman warns that, in addition to substituting simpler questions for difficult ones, System 1 thinking is prone to numerous biases that cause you to misinterpret information. As a result, even when you try to think rationally about your decisions, you may be influenced by errors in thinking that lead to mistakes. 

Some of the most common cognitive biases include:

  1. Confirmation bias
  2. Availability bias
  3. Anchoring bias
  4. Narrative fallacy
  5. Narrow framing

Let’s explore these five cognitive bias examples below.

Confirmation bias: People tend to selectively pay attention to information that confirms what they already believe and discard information that doesn’t. Similarly, people often interpret things in ways that support their existing beliefs. For example, a person with conservative views is prone to interpreting crime data in a way that supports their beliefs on gun control—while a person with progressive views will interpret the exact same data in a way that supports their completely opposing gun-control beliefs.

(Shortform note: Information that challenges our beliefs feels threatening, and therefore confirmation bias serves as a defense mechanism against such information. In Awaken the Giant Within, life coach Tony Robbins explains that our beliefs are key parts of our identities. As a result, the thought that a deeply-held belief could be wrong feels like a personal attack—not just that we’re wrong about something, but that something about us is fundamentally wrong.)

Availability bias: Kahneman writes that people pay more attention, and attach more emotion, to events they’ve recently heard about—in other words, events that are more available in their working memory. When events are fresher in someone’s memory, that person often believes those events are more common than they actually are. Meanwhile, they’ll ignore issues that may be statistically more likely to happen but haven’t been brought to their attention lately.

For example, someone living in a city might frequently hear news stories about crime among homeless people and might come to believe they’re in great danger every time they leave their apartment. They’ll believe this even if they’re statistically unlikely to be affected by such crime. That same person might be far more likely to be hit by a delivery bike, but since those accidents rarely make the news, the person doesn’t think about them, doesn’t feel they’re common, and isn’t afraid of them.

(Shortform note: The availability bias is closely related to the salience bias, where we notice things that stand out because of their vividness or emotional impact. Both of these biases are, in turn, correlated with the negativity bias, by which we pay more attention to negative information than to positive. These three biases work together to create a loop: When people talk about shocking events because of the salience and negativity biases, those events become more available to our recent memories, which makes us focus on them more, leading us to talk about them even more. This loop is why people are often more afraid of (rare but shocking) shark attacks than of (far more common but less dramatic) car accidents.)

Anchoring bias: When shown an initial piece of information, people’s decisions will be skewed toward that information, even if it’s irrelevant to the decision at hand. For instance, in one study, when a nonprofit requested $400, the average donation was $143; when it requested $5, the average donation was $20. In this example, the suggested donation influenced how much people actually donated, even though it wasn’t really relevant—people were free to donate as much or as little as they wanted to.

(Shortform note: Anchoring bias may result in part from the phenomenon of psychological priming. When we are exposed to an idea, that idea activates, or primes, parts of our brain, and those parts of the brain stay active as we process further information. This can affect our thinking by anchoring us to the first idea we heard and to the mental connections we drew from it. To avoid anchoring bias, actively try to think of counterarguments or alternative options, and look for reasons why they might be better than the anchored information.) 

Narrative fallacy: Kahneman says that people try to create coherent stories to explain random events. Then, because their stories sound plausible, people feel an unjustified level of confidence in their ability to predict future events. For example, two boxers might be so evenly matched that the outcome of their match could go either way. However, sports pundits discussing the match will invent stories about how the loser buckled under the pressure, or the winner “wanted it more.” If those boxers were to have a rematch, the pundits would try to predict the winner based on the stories they previously created, even though the outcome would be just as unpredictable as before.

(Shortform note: The narrative fallacy comes from the natural human desire to understand and control, or at least predict, the world around you. For example, psychologists believe that conspiracy theories—extreme narrative fallacies that draw connections between totally unrelated events—are actually self-soothing anxiety responses. If there’s a group secretly orchestrating events (for instance, the Illuminati are popular scapegoats), that means nothing is ever truly random; therefore, any future disaster can be predicted and prepared for. While belief in a global, nearly omnipotent group like the Illuminati might seem terrifying, some people find comfort in having a tangible enemy to fight, rather than being at the mercy of random chance.)

Narrow framing: According to Kahneman, people tend to make decisions based on relatively small pools of information instead of considering the whole picture. There are numerous ways this fallacy can manifest. For example, in the planning fallacy, people tend to overlook all the ways a project could go wrong, and therefore underestimate how much time it will require—they only factor in information about how long it will take in an ideal situation. Another example is the sunk cost fallacy, which happens when people narrow their focus to getting back what they’ve lost on a failed endeavor. However, if they consider all of their options, they’ll see it’s better to cut their losses and invest their resources elsewhere.

(Shortform note: Narrow framing may be the unavoidable result of working memory’s natural limitations. Your working memory—where your brain stores ideas that you’re currently using to solve a problem or make a decision—can only hold a few pieces of information at once. Researchers disagree on exactly how many ideas you can hold at once, and it varies from person to person, but common estimates place the average working memory capacity at somewhere from two to four ideas. As a result, regardless of whether you’re using System 1 or System 2 thinking, it’s simply not possible to consider everything when making a decision.)  


———End of Preview———

Like what you just read? Read the rest of the world's best summary of "Thinking, Fast and Slow" at Shortform. Learn the book's critical concepts in 20 minutes or less.

Here's what you'll find in our full Thinking, Fast and Slow summary :

  • Why we get easily fooled when we're stressed and preoccupied
  • Why we tend to overestimate the likelihood of good things happening (like the lottery)
  • How to protect yourself from making bad decisions and from scam artists

Katie Doll

Somehow, Katie was able to pull off her childhood dream of creating a career around books after graduating with a degree in English and a concentration in Creative Writing. Her preferred genre of books has changed drastically over the years, from fantasy/dystopian young-adult to moving novels and non-fiction books on the human experience. Katie especially enjoys reading and writing about all things television, good and bad.
