This article gives you a glimpse of what you can learn with Shortform. Shortform has the world’s best guides to 1000+ nonfiction books, plus other resources to help you accelerate your learning.
Want to learn faster and get smarter? Sign up for a free trial here.
Why do we feel compelled to eat the entire serving of lasagna? Why do we like the paint brand that has our favorite baseball player in its commercials? Why do we remember our first kiss better than what happened last week?
We’re all prone to cognitive biases—quirks of the mind that distort reality in one way or another. These biases don’t adhere to logic or rational thinking. Our faulty perception gets in the way. We don’t see things as they truly are, and we often act on these misperceptions. We all can benefit from recognizing and understanding these biases.
Here’s a list of cognitive biases that are fairly common, along with descriptions and examples.
List of Cognitive Biases
Cognitive biases create problems for ourselves and society in general. While we can’t become immune to them, we can benefit from recognizing them in ourselves and others. Here, we’ve included a list of some of the most common cognitive biases, organized into three categories:
- Biases related to belief formation, decision-making, and behavior
- Biases related to social interaction
- Biases related to memory
Cognitive Biases Related to Belief Formation, Decision-Making, and Behavior
Some cognitive biases are associated with the way we form beliefs, the way we make decisions, and how we behave. Here’s a list of cognitive biases in this category.
Action Bias
Action bias is the tendency to take action rather than waiting for a better opportunity or more information. The Art of Thinking Clearly explains that, for early humans, delaying action could mean death (for instance, if a predator approached). While this is much less often the case for modern humans, we still instinctively want to act in all situations. However, this instinct causes problems in today’s complex world, when impulsive actions are more likely to cause difficulties than waiting and thinking things through.
Dennis voted for a candidate because he saw her signs all over the neighborhood, but he didn’t research her positions on the issues.
Anchoring
Anchoring describes the bias where we depend too heavily on an initial piece of information when making decisions. The bias causes us to favor that initial information, even if it’s irrelevant to the decision at hand.
Thinking, Fast and Slow explains that, when we are exposed to a particular number and then asked to estimate an unknown quantity, the initial number affects our estimate of the unknown quantity. Surprisingly, this happens even when the number has no meaningful relevance to the quantity to be estimated.
A nonprofit asks for different amounts of donations in its requests. When it asks for $400, the average donation is $143. When requesting $5, the average donation is $20.
Attribute Substitution
Next on our list of cognitive biases is attribute substitution. This is a “bait and switch” maneuver in which we unconsciously substitute an easy question for a harder one.
The book Superforecasting shares the story of Scottish physician and professor Archie Cochrane, who was known as the “father of evidence-based medicine.” His beliefs were put to the test in 1956 when he had a small cancerous lesion removed from his hand. Although this type of cancer typically didn’t spread, he was referred to a specialist, who found a lump under Cochrane’s arm and recommended surgery to remove it. Trusting the specialist, Cochrane agreed to the surgery.
Afterward, the specialist delivered bad news: He’d found cancerous tissue extending all the way into the chest and had removed an entire muscle. Cochrane later found out from the delayed pathologist’s report that the tissue in his arm and chest had never been cancerous at all. He lost an entire muscle in his chest and faced his own mortality, all without any empirical evidence of his supposed disease.
In Cochrane’s case, the question “Do I have cancer?” was impossible to answer without more data. But the question “Is it likely that this specialist knows whether or not I have cancer?” was much easier—and a resounding “Yes,” given the circumstances. Most of the time, we don’t realize we’re making this substitution, which explains why Cochrane was so quick to accept his false-positive diagnosis.
Availability Heuristic
The availability heuristic (also known as the availability bias and the availability-misweighing tendency) was originally identified by psychologists Amos Tversky and Daniel Kahneman, as described in Kahneman’s book Thinking, Fast and Slow. It was first studied when researchers found that, when asked if there were more words in English that started with the letter “k” or had “k” as the third letter, subjects incorrectly guessed the former, as words that began with “k” came to mind more quickly and easily. Whatever comes to our mind more easily is weighted as more important or true.
Poor Charlie’s Almanack describes the availability heuristic as basically thinking that what we see is all there is. Our brain works with what’s available to us and doesn’t think as hard about what’s missing. We have a limited capacity to remember, recall, and think—so we jump to what’s easily available. By using only what’s recently available, we ignore other important data that would have helped us make a better decision.
The book Superforecasting uses the example of hearing the snap of a twig on the savannah. If it brings to mind a memory we have of a lion pouncing on its prey, we automatically conclude that a lion must be the source of the sound. If we’ve never witnessed or heard of a lion attack, we’ll interpret the sound differently. The availability heuristic plays out unconsciously, in fractions of a second.
Better-Than-Average Effect
Next on our list of cognitive biases is the better-than-average effect. We sometimes believe that we’re above average in every way and therefore better than everyone around us. This can lead to poor choices about the future.
Celia thinks she’s a better swimmer than her friends are. She heads into waters they claim are dangerous because she thinks she can manage better than they can—even though this poses a threat to her safety.
Cognitive Dissonance
Cognitive dissonance occurs when we hold thoughts or beliefs that oppose or contradict our decisions or attitudes. The concept originates from psychologist Leon Festinger’s 1957 theory, which claims that we have an innate drive to maintain harmony among our thoughts, beliefs, and behaviors.
When we face a decision, every option has pros and cons, so we always lose out on something. We tend to overcome the dissonance by rationalizing. While this may seem irrational, it creates cognitive consistency, which is rational in that it reduces the anxiety of dissonance and helps us get on with our life.
Black Box Thinking explains that nobody likes to be wrong—it’s a threat to our egos. So, to defend our versions of events, our brains distort the information to conform to our beliefs. And the more invested we are, the worse the distortions.
Ben Franklin gave advice on what to do about people who don’t like us: Ask them to do us a favor. If they do, they resolve the cognitive dissonance by deciding that they must like us more than they previously thought.
Confirmation Bias
When we suffer from confirmation bias, we retain information that reinforces our underlying beliefs or desired conclusions while ignoring contradictory evidence. The Art of Thinking Clearly warns that everyone is prone to confirmation bias, and that some people deliberately exploit this natural tendency in others.
Fortune tellers give vague statements, trusting that our confirmation bias will latch onto the most fitting interpretation and reinforce our belief.
Curse of Knowledge
Next on our list of cognitive biases is the curse of knowledge. This bias comes into play when experts forget what being a novice was like and instead speak in abstractions. The experts wrongly assume their listeners have the same level of knowledge or expertise as they do.
Make It Stick explains that, the more proficient we are in a particular skill or subject area, the more ingrained our mental models become. The more ingrained our mental models, the harder it is to break them down into individual steps in order to teach someone else—and the more likely we are to underestimate how long it will take someone to learn the skill.
Made to Stick provides an example of the curse of knowledge in action. In 1990, Stanford Ph.D. student Elizabeth Newton asked subjects to choose a simple tune, such as “Jingle Bells,” from a list and tap out the rhythm on a table. She assigned other people to try to figure out the tune being tapped. Tappers were shocked to learn that listeners guessed the song correctly only 2.5 percent of the time. Because they were hearing the song in their heads as they tapped (knowledge the listener didn’t possess), they thought their tapping was making the song perfectly clear. Because they had the curse-of-knowledge bias, they couldn’t imagine the perspective of the listener who wasn’t “hearing” the same song. Once we know something, it’s hard to remember that others don’t.
End-of-History Illusion
In the book Range, David Epstein makes the case for generalism: a broad competence in many areas rather than the extreme mastery of one. He points out that studies show that people vastly underestimate how much they’re likely to change over the next several years—a phenomenon known as the “end-of-history illusion”—so they often overcommit to plans they make at a young age. Epstein states that our personalities change the most between the ages of 18 and 30, yet this is when many people decide what they’re going to do for the rest of their lives.
The Psychology of Money explains that most people fall victim to the end-of-history illusion when making financial plans. We recognize that we’ve changed significantly from who we were, but we don’t expect to change significantly from who we are now. In reality, though, we’re likely to change just as much in the future as we did in the past. To develop a long-term strategy we can follow over decades, we should expect our future goals to change.
Curtis changed his career twice between the ages of 20 and 30. He recognizes that he has changed significantly since 20, but he expects that he won’t change his career again for the rest of his life.
Framing Effect
Next on our list of cognitive biases is the framing effect. This is when information is presented in different ways to achieve a desired perception. We respond differently to identical choices based solely on how they’re presented. Common frames include avoiding risk, achieving a goal, and appealing to emotions.
Hooked: How to Build Habit-Forming Products recounts a time when violin virtuoso Joshua Bell played in a DC subway station. He was largely ignored because the context of the subway station made the idea of a virtuoso playing there seem improbable.
Hindsight Bias
Hindsight bias makes past events seem like they should’ve been easily predictable. The Art of Thinking Clearly explains that we see an obvious pattern of circumstances that led to a past event occurring, and we think people should have noticed that pattern and predicted the event. At the time, though, the pattern wasn’t clear, so people couldn’t use it to predict the event. It’s only with hindsight that the pattern becomes clear.
Thinking, Fast and Slow mentions that many people believed, after the fact, that they foresaw the 2000 dot-com bubble bursting or the 2008 financial crisis happening.
Mere Exposure Effect
Why do we tend to dislike things that are unfamiliar? This can be partly explained by what’s known as the “mere exposure effect” or the “familiarity principle.” The mere exposure effect occurs when we start to like things just because we’ve been exposed to them before.
Jordan likes a song he hears on the radio because he’s heard it a few times before.
Omission Bias
Next on our list of cognitive biases is the omission bias. When reflecting on the recent past, we tend to regret the choices we made rather than those we failed to make. When reflecting on the distant past, the pattern reverses: We tend to regret the choices we failed to make rather than those we made.
The Art of Thinking Clearly explains that, when both acting and not acting have negative results, we’re prone to not acting. This bias causes problems when acting could at least mitigate the negative results.
Morgan receives a job offer. Both accepting the offer and not accepting the offer will cause her stress. Even though taking the job will cause her less stress, the omission bias leads her to turn it down.
Outcome Bias
Related to hindsight bias, outcome bias is the tendency to judge the quality of a decision by its outcome rather than by how sound the decision was given the information available at the time. People who succeeded are assumed to have made better decisions than people who failed.
After the September 11 attacks, the U.S. government was criticized for ignoring intelligence about al-Qaeda received in July 2001. But this criticism overlooks how difficult it was, before the attacks, to predict that they would follow from that piece of information.
Parkinson’s Law of Triviality
The tendency to focus on the least important aspects of an important task or problem is known as Parkinson’s Law of Triviality. It’s also called bikeshedding.
The term “bikeshedding” comes from C. Northcote Parkinson’s illustration of a committee that, tasked with approving plans for a nuclear power plant, spent an inordinate amount of time debating the design of the staff bike shed.
Prospect Theory
Prospect theory describes how people choose among uncertain or risky options and the role that bias plays in these choices. The Paradox of Choice explains that prospect theory involves the psychological effects of gains and losses. As we make a gain, we feel good, but as our gains accumulate, each additional gain adds less satisfaction. Likewise, when we incur a loss, we feel bad, but as our losses mount, our dissatisfaction might not grow very much.
Cherise felt elated when she won $1 million in the lottery. With the prospect of doubling her winnings, she expected to feel twice as happy. However, when it actually happened, she didn’t feel much happier than when she won the first million. The more she accumulates, the less additional happiness she feels.
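The diminishing satisfaction described above corresponds to the curvature of prospect theory's value function. Here's a minimal Python sketch of that function; the name `subjective_value` is our own, and the parameter values are the estimates Tversky and Kahneman published in their 1992 follow-up work, so treat the exact numbers as illustrative.

```python
# A minimal sketch of prospect theory's value function, which maps an
# objective gain or loss to its subjective ("felt") value. The function
# name is ours; the parameters (alpha = beta = 0.88, loss-aversion
# lambda = 2.25) are Tversky and Kahneman's 1992 estimates.

def subjective_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of an objective gain (x >= 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha            # gains: concave, so satisfaction shrinks
    return -lam * ((-x) ** beta)     # losses: steeper, so losses hurt more

# Cherise's second million adds less felt happiness than her first:
first_million = subjective_value(1_000_000)
second_million = subjective_value(2_000_000) - first_million
assert second_million < first_million

# Loss aversion: a $100 loss hurts more than a $100 gain pleases.
assert abs(subjective_value(-100)) > subjective_value(100)
```

The concave gain branch is what makes the second million feel smaller than the first; the steeper loss branch captures why losses loom larger than equivalent gains.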
Surrogation
Using an objective measurement makes it easier for us to compare options. However, as the book Algorithms to Live By explains, this strategy can easily backfire. If the metric we’re using misaligns with what we’re trying to accomplish, we could end up chasing the wrong goals and sabotaging ourselves. This kind of misalignment is very difficult to spot. We have a powerful cognitive bias that causes us to equate the metric used to measure a strategy’s progress with the strategy itself. Psychologists have termed this phenomenon “surrogation.”
Carl bought an airline ticket based on its price and nothing else. He found himself on a cheap but miserable sixty-hour-long string of connecting flights, ruining his vacation. Instead of thinking about which airline ticket would result in the best vacation, he assumed that the cheapest airfare meant the best vacation.
Unit Bias
Next on our list of cognitive biases is the unit bias: the tendency to think that one unit of something is the right amount, which makes us want to finish it. In the book In Defense of Food, Michael Pollan discusses this bias in the context of food: People tend to believe whatever portion they’re served is the right amount to eat.
Cheyenne attended a drink-wine-and-paint event. She found a pint of paint and a bottle of wine at her station. Two hours later, the pint and the bottle were empty. Only her compromised judgment could allow her to appreciate the chaos that her canvas had become.
Cognitive Biases Related to Social Interaction
Some cognitive biases are associated with the way we interact with other people. Here’s a list of cognitive biases in this category.
Association Fallacy
Next on our list of cognitive biases is the association fallacy, also known as the halo effect or the liking/loving tendency. Poor Charlie’s Almanack explains that humans like and love certain things and people. In particular, we have an affinity for our mothers, much like goslings attach to whatever is there at birth. When we have an object of affection, we tend to ignore their faults and comply with their wishes. We like people, products, and actions that are associated with our object of affection.
Advertisers famously associate products with popular figures. When we like the people in the advertisement, we transfer some of that affection to the product.
Attribution Bias
We usually view our own misfortune through a different lens than the one we use to view others’ misfortune. We credit other people’s failures to a lack of skill, but we blame our own failures on bad luck. This is called the attribution bias: the faulty thinking we use to explain our own and others’ behaviors.
Justin lost his job last year. Whitney recently lost hers. Whitney blames Justin’s laziness for his job loss, and she blames the economy for hers. Even if her assessments happen to be accurate, they aren’t based on the facts.
In-Group, Out-Group Bias
The Art of Thinking Clearly explains that one aspect of humanity’s membership in groups is prioritizing our own group above others. This means that we magnify our group’s positive traits and minimize those of other groups: a phenomenon called in-group, out-group bias.
Even though Stefan’s choir came in third in the competition, he believes that his choir is better than the choir that won.
Pygmalion Effect
Next on our list of cognitive biases is the Pygmalion effect. This is a self-fulfilling prophecy in which our higher expectations for a person lead to improvements in that person’s performance.
Give and Take discusses how this effect has been demonstrated in experiments across a wide range of professional and educational settings. Typically, all students take a faux exam, and a percentage of them are then randomly assigned to a false “high potential” group. Instructors are told which students were identified as high potential (in reality, because the assignment was random, these students are no different from their peers). Across a variety of environments, the falsely labeled “high potential” students, or “bloomers,” achieve greater gains.
Reactance
Feeling in control of our lives is so essential that we often rebel when we don’t feel that way. This phenomenon is known as reactance. People who feel they lack control will refuse to do even things they want to do, just to regain that sense of control.
In Influence, psychologist Robert Cialdini defines reactance as an adverse reaction we have to any restriction of our choices and explains that we don’t exhibit it if something is freely available (because we don’t feel restricted).
There’s a chocolate shortage, so Kandace wants chocolate even more than she did before. When her freedom to choose is restricted, the restricted option becomes more desirable.
Social Comparison Bias
Next on our list of cognitive biases is the social comparison bias. Social comparison theory describes the human inclination to determine our self-worth based on how we compare with others. At its best, social comparison can motivate us to improve (which raises our self-esteem) when we compare ourselves to people who excel in one particular trait but who are otherwise similar to us. At its worst, social comparison can make us feel insecure or arrogant, depending on whether we make upward comparisons (judging ourselves against people we deem superior) or downward comparisons (judging ourselves against those we deem inferior).
The Happiness Trap explains that, in the world of social media, where everybody is pushing an idealized version of themselves, part of our brain looks at others and makes us worry that we compare unfavorably.
Cognitive Biases Related to Memory
Some cognitive biases are associated with the way we remember information. Here’s a list of cognitive biases in this category.
Consistency Bias
We want our current behavior to be consistent with our past behavior. We don’t want to change our minds. If our beliefs are attacked, we might double down and entrench ourselves, rationalizing along the way.
Dale Carnegie, in the classic How to Win Friends and Influence People, discusses the consistency bias in the context of winning people over to our way of thinking. He argues that we must get them to say “yes” immediately because, every time someone says “no,” they get locked into defensiveness and consistency bias. Inertia builds. It becomes harder to influence them. Even if they later realize they need to change their mind, their precious pride gets in the way.
In Hooked, Nir Eyal mentions an experiment in which people were asked to place a large political sign in their yard. Some people also had first been asked to place a political sticker in their window. Those people were four times more likely to put up a yard sign than those who didn’t put up a sticker beforehand.
Generation Effect
Next on our list of cognitive biases is the generation (or self-generation) effect. We’re more likely to remember things we’ve created ourselves versus things we’ve heard or read.
The Slight Edge recommends that we write out our vision of success or create a piece of visual art to represent it. For example, we could make a vision board. Creating a tangible representation of our vision is critical because it allows us to get clear and specific with our goals, take ownership of them, and keep them in the forefront of our minds.
Cryptomnesia
Cryptomnesia is a case of false memories spun from forgotten details of a person’s life experiences. Many claims of past life regressions have turned out to be the result of cryptomnesia.
While under hypnosis in 1952, a Colorado housewife named Virginia Tighe recalled a past life as an Irishwoman named Bridey Murphy. The story collapsed, however, when researchers discovered that Tighe had lived across the street from an Irish immigrant named Bridie Murphy Corkell as a young child. They concluded that she must have stored this name as a buried memory, which was later unearthed and spun into an entire “past life” story under the suggestive power of hypnosis.
Mood-Congruent Memory Bias (State-Dependent Memory)
The state we’re in affects what we remember. Similar to the way that we can better recall information in the same place we were when we learned it, we can also better recall information when we’re in the same mood or physical state.
One study showed that people who learn information while drunk remember it better while drunk, while those who learn information sober remember it better sober.
Negativity Bias or Negativity Effect
Research shows that we’re hardwired to notice and dwell on negative events. Further, negative events have a greater impact on us than positive ones—our emotional responses are stronger for negative events than they are for positive ones. In other words, negative events feel more important to us than positive ones. Subsequently, negative events create a strong and vivid impression in our long-term memory, and they influence the decisions we make.
Consequently, we’re likely to notice, react to, and remember:
- Criticism more than praise
- Sad memories more than happy memories
- Bad news more than good news
- Our mistakes more than our successes
- Negative traits in others more than their positive traits
Andre’s performance review at work included several positive remarks and one negative one. He downplayed the positive points and focused on the negative point.
Primacy Effect
Next on our list of cognitive biases is the primacy effect. When we interpret multiple pieces of information, our natural tendency is to better remember and place more importance on the first information given to us.
The book Profit First recommends that we calculate profit in our accounting first. Then we’ll place the most focus and importance on it.
Recency Effect
The recency effect says that we’ll better remember and focus more on information given to us most recently.
Shortly before Spencer took his biology exam, he reviewed the information he expected to be on the test. That’s the information he most easily recalled during the exam.
Reminiscence Bump
There’s a high chance that, if we were to look back on our lives and name the most memorable events, most of them would fall between our teenage years and about age 30. This is due to the “reminiscence bump”: the tendency for older adults to recall memories from their late adolescence and early adulthood more than from any other point in their lives. Chip Heath and Dan Heath discuss this tendency in their book The Power of Moments.
The reminiscence bump happens because, between the ages of 15 and 30, we experienced many of our “firsts.” Each of these events is especially memorable because they were a way of breaking the script of our life. These exciting departures from our pre-established way of doing things built peaks into the relative flatness of our life leading up to age 15 (as fun as childhood is, it’s a lot of the same thing every day). After about age 30, our “firsts” reduce drastically and our life becomes a fairly flat script, as we’ve already experienced most of the naturally-occurring opportunities to try new things that life has to offer.
Jaclyn, a 60-year-old, remembers her first kiss, her first job, and her first apartment better than events that happened in her 40s.
Self-Serving Bias
Next on our list of cognitive biases is the self-serving bias. We tend to own positive outcomes—attributing them to skill—and we brush off negative ones as bad luck. We don’t want bad things to be our fault. It’s tempting to give in to self-serving bias even if we’ve accepted that most outcomes result from a combination of luck and skill.
Annie Duke discusses this bias in her book Thinking in Bets, explaining that it can make us overlook the learning opportunity in a mistake—or it can cause us to double down on beliefs that are holding us back.
An actor who can’t find work tells himself that he’s just unlucky or he hasn’t come across the right role yet. He’s always been praised for his talent; he knows he’s good enough. He might be right that it’s just bad luck stopping him. But maybe he needs to try auditioning with a different monologue or learn new techniques. His certainty about his own talent might be preventing him from experimenting with new, potentially more effective, approaches.
Spotlight Effect
The spotlight effect is our (largely mistaken) perception that others are paying close attention to us and judging us.
In the book Nudge, Nobel Prize-winning economist Richard Thaler and renowned legal scholar Cass Sunstein discuss an experiment that illustrates the spotlight effect. After determining that Barry Manilow was the most embarrassing celebrity to advertise on a T-shirt, researchers had student subjects wear a Barry Manilow T-shirt while filling out a questionnaire alongside fellow students. The researchers then made up an excuse to remove each subject from the room and asked him or her to estimate how many of the other students could identify the person on the T-shirt. The average estimate was just under 50%, but the actual number was closer to 20%. (Dispassionate observers who watched the classroom on videotape also estimated around 20%.) In other words, the students wearing the shirt greatly overestimated how much attention others were paying to them.
Testing Effect
Next on our list of cognitive biases is the testing effect. We’re more likely to remember something we’ve learned if we’re tested on it. In explaining the basis for intentional recall practice, Barbara Oakley brings up the testing effect in her book A Mind for Numbers. She points out that, when we practice intentional recall, we’re basically testing ourselves, so intentional recall allows us to take advantage of the testing effect in our own study.
In psychology, the principle that active recall is more effective than passive studying is known as the “testing effect” because it was initially studied using tests. The researcher would present the same material to two groups of students. One group would then take a test on the material, while the other group would not. Later, both groups would be tested on the material, and the group that took the first test would do better.
Zeigarnik Effect
Before a task is completed, our brain keeps it at the forefront of our memory. Once the task is completed, however, we immediately forget about it. In The Art of Thinking Clearly, Rolf Dobelli explains that this makes our brains efficient: They hold information only as long as necessary, and once that information is deemed unimportant, they discard it to free up mental space for the next important piece of information. Completed tasks are considered unimportant and thus discarded. There’s one exception to this rule, Dobelli adds: If you have a number of tasks to complete, making a concrete plan to deal with them can signal your brain to forget the tasks before you finish them.
Despite the Zeigarnik effect’s popularity in productivity books, most modern psychology books don’t mention it because researchers haven’t been able to replicate it reliably.
For weeks, “organize the garage” sat on Jorge’s to-do list, and the thought nagged at him. Once he got the job done, the nagging stopped, and his mind moved on to organizing the kitchen drawers.
While this list of cognitive biases is not exhaustive, it includes the most common biases that we encounter in ourselves and those around us. The more we understand them, the better we can recognize them and combat them—or leverage them—as necessary.
Want to fast-track your learning? With Shortform, you’ll gain insights you won’t find anywhere else.
Here’s what you’ll get when you sign up for Shortform:
- Complicated ideas explained in simple and concise ways
- Smart analysis that connects what you’re reading to other key concepts
- Writing with zero fluff because we know how important your time is