

This article is an excerpt from the Shortform summary of "Thinking, Fast and Slow" by Daniel Kahneman. Shortform has the world's best summaries of books you should be reading.
What is availability bias? When does it occur, and how can you avoid it?
Availability bias is the tendency to place more importance on information we can easily remember. The more easily you remember something, the more significant you perceive it to be; conversely, things that are hard to remember are perceived as less significant.
Learn how the availability bias, also known as the availability heuristic in psychology, hurts our thinking skills. We’ll cover the role of availability bias in the media and what you can do to overcome availability bias.
Availability Heuristic Bias
When trying to answer the question “what do I think about X?,” you actually tend to think about the easier but misleading questions, “what do I remember about X, and how easily do I remember it?” This is the availability bias at work.
More quantitatively, when trying to estimate the size of a category or the frequency of an event, you instead use the availability heuristic: how easily do instances come to mind? Whatever comes to mind more easily is weighted as more important or more likely to be true.
Using the availability heuristic means a few things:
- Items that are easier to recall take on greater weight than they should.
- When estimating the size of a category, like “dangerous animals,” if it’s easy to retrieve items for a category, you’ll judge the category to be large.
- When estimating the frequency of an event, if it’s easy to think of examples, you’ll perceive the event to be more frequent.
How Availability Bias Manifests
In practice, the availability bias manifests in a number of ways:
- Events that trigger stronger emotions (like terrorist attacks) are more readily available than events that don’t (like deaths from diabetes), causing you to overestimate the importance of the more provocative events.
- More recent events are more available than past events, and are therefore judged to be more important.
- More vivid, visual examples are more available than mere words. For instance, it’s easier to remember the details of a painting than it is to remember the details of a passage of text. Consequently, we often value visual information over verbal.
- Personal experiences are more available than statistics or data.
- In a famous study, spouses were each asked what percentage of household tasks they contributed. The two answers typically summed to more than 100% – for instance, each spouse might claim to do 70% of the work. Because of availability bias, each spouse readily recalls their own contributions but not their partner’s, so each believes they do more than their share.
- Items that are covered more in popular media take on a greater perceived importance than those that aren’t, even if the topics that aren’t covered have more practical importance.
Availability bias also leads us to overweight small risks. Parents anxiously waiting for their teenage child to come home at night are dwelling on the fears that come most readily to mind, rather than on the realistically low chance that the child is actually in danger.
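The spouse example above can be sketched as a quick bit of arithmetic. This is a minimal illustration, not data from an actual study; the 70% figures are the hypothetical values used in the example:

```python
# Hypothetical self-reports: each spouse's claimed share of household tasks.
# Availability bias makes each spouse recall their own work more easily
# than their partner's, inflating each self-report.
self_reported = {"spouse_a": 70, "spouse_b": 70}  # percentages

total = sum(self_reported.values())
print(f"Combined claimed contribution: {total}%")  # 140%

# The shares can't really exceed 100%, so the excess measures
# the combined overestimate attributable to the bias.
overclaim = total - 100
print(f"Combined overestimate: {overclaim} percentage points")  # 40
```

If the reports were unbiased, the two shares would sum to exactly 100; any surplus is the signature of each spouse weighting their own easily recalled work too heavily.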
Availability Bias and the Media
Within the media, availability bias can cause a vicious cycle where something minor gets blown out of proportion:
- A minor but curious event is reported, and a group of people overreacts to the news.
- News about the overreaction triggers more attention and coverage of the event. Since media companies make money from reporting worrying news, they hop on the bandwagon and make it an item of constant news coverage.
- This snowballs as more and more people come to see the event as a crisis.
- Naysayers who say the event is not a big deal are rejected as participating in a coverup.
- Eventually, all of this can affect real policy, where scarce resources are used to solve an overreaction rather than a quantitatively more important problem.
In Thinking, Fast and Slow, Kahneman cites the example of New York’s Love Canal in the 1970s, where buried toxic waste polluted a water well. Residents were outraged, and the media seized on the story, claiming it was a disaster. Eventually legislation was passed that mandated the expensive cleanup of toxic sites. Kahneman argues that the pollution has not been shown to have any actual health effects, and the money could have been spent on far more worthwhile causes to save more lives.
———End of Preview———
