PDF Summary: Being Wrong, by Kathryn Schulz

Book Summary: Learn the key points in minutes.

Below is a preview of the Shortform book summary of Being Wrong by Kathryn Schulz. Read the full comprehensive summary at Shortform.

1-Page PDF Summary of Being Wrong

Making mistakes and believing things that aren’t true are unavoidable parts of human nature, and that’s not necessarily a bad thing. After all, it’s our beliefs, right or wrong, that let us function in our daily lives, and it’s only through mistakes that we can learn and grow. In Being Wrong, journalist Kathryn Schulz argues that we shouldn’t try to avoid mistakes completely. Instead, we should change how we view our mistakes—and maybe even laugh at ourselves once in a while.

In this guide, we’ll explain how mistakes enrich our lives, why our senses and beliefs often fool us, and how to deal with making mistakes without falling victim to shame and denial. We’ll also draw on the wider world of psychology and science to explore the reasons behind human error, how to accept it in a healthy way, and how we can dodge mistakes or use them to our advantage.

(continued)...

The Right Words Shape Your Reactions

Schulz points to people’s common word choices as a reflection of how making mistakes makes us feel, but the relationship between words and feelings goes both ways—the language we use can influence our feelings, rather than our feelings merely dictating how we speak. Research has shown that putting negative feelings into words lessens their impact by shifting where the brain processes emotion from the primitive amygdala to the more highly developed prefrontal cortex, where reason and language mitigate strong emotions. If this is the case, then admitting to error and describing how it feels can literally change which part of your brain you use to cope with the fallout of making mistakes.

Just as giving voice to the pain of making mistakes helps to soothe it, so too can the use of language steer your brain toward seeing mistakes in a more positive light. In The Happiness Advantage, Shawn Achor argues that the deliberate use of positive language—such as keeping a gratitude journal and writing down positive experiences in your life—will retrain your brain to see positive outcomes rather than dwelling on disappointment. Applied to the experience of error Schulz describes, a practiced and deliberate positive outlook can make viewing mistakes as “happy surprises” a more common occurrence.

However, Schulz points out that the feelings we associate with being wrong always occur after the fact. In the instant you call your friend the wrong name or make an erroneous statement, you don’t feel as if you’re wrong at all, because in your mind you’re not. Therefore, the real-time experience of being wrong is no different from the experience of being right. What’s more, even when you discover that you’re wrong, you do so by replacing the wrong belief with a corrected belief. In other words, you don’t change from being wrong to right; you change from feeling that you’re right in one way to feeling that you’re right in a different way. The pain of mistakes is the pain of having been wrong, not that of being wrong in the present.

(Shortform note: The fact that everyone always feels like they’re right is part of what makes resolving conflicts difficult. Since, as Schulz says, the pain of being wrong comes after the realization, suggesting that either side in a dispute is wrong invites them to feel that pain, which, of course, they’ll resist. This is why the authors of Difficult Conversations insist that arguing about who’s right or wrong is a useless approach to resolving disputes. What they suggest instead is to try to understand why each party believes they’re right so that everyone involved can understand why their “rights” are in conflict with each other.)

Schulz suggests that people don’t change a belief until they have a new one to replace the old one. For instance, in a flash, you’ll change one belief—“I left my car keys on the table”—to a corrected belief—“I dropped my car keys in the laundry.” This makes the moment of transition so elusive that in most cases it can’t be pinned down. It either happens instantly, as in the example of your keys, or so gradually that you don’t notice the change, such as a shift in a political view that happens over several years. We rarely, if ever, get stuck mid-transition because, as a rule, we don’t reconsider beliefs in isolation—we always need two or more to compare, though weighing them against each other can sometimes make changing a belief take a while.

(Shortform note: The mental frameworks that support our beliefs can be even more complex than Schulz presents. Just as beliefs don’t exist in isolation, they’re also reinforced by habits and behaviors based on those beliefs. In A Mind for Numbers, Barbara Oakley argues that to change an ingrained habit, such as using caffeine to stay awake at work, you have to change your beliefs about that habit as well as believing that the change will do you good. In this case, you’d have to believe a good night’s sleep is more beneficial than caffeine with none of the side effects. Many experts on personal growth acknowledge that changes like this take time—the delicate balance of belief and behavior has to be reordered in a way that won’t collapse.)

Certainty and Doubt

How tightly we cling to any given belief falls somewhere on the spectrum between doubt and certainty. Schulz argues that certainty is the default position for more of our beliefs than we realize, and while that’s necessary for us to get through the day, it carries inherent dangers that only the discomfort of doubt can assuage.

To be certain about the basic rules of life is more than a comfort; it’s a necessity. Consider how many beliefs we take so much for granted that we don’t even think of them as beliefs. We’re sure that an object will fall if we drop it. We’re sure that the sun will come up in the morning. We’re sure we need food and water to survive. Schulz explains that our whole understanding of the world is built on a framework of beliefs so fundamental that it wouldn’t make sense to doubt any of them. How could you function for a moment if you doubted that gravity works or that you need air to breathe?

(Shortform note: Schulz presents our most basic beliefs as questions of fact, such as air and gravity, but how these beliefs take root may go beyond simple matters of factual accuracy. In Maps of Meaning, Jordan Peterson argues that subjective experience is more important to the mind than objective observation. In other words, in our everyday lives, we don’t think of gravity as a physical law, but in terms of how we experience its effects and how we behave in response. Furthermore, in The Art of Thinking Clearly, Rolf Dobelli asserts that accepting our most basic beliefs is a biological imperative that’s been programmed into us by evolution itself. After all, as Schulz herself makes clear, doubt can be a poor survival strategy.)

Schulz says that certainty gets us into trouble because many core beliefs are tightly wrapped up in our sense of identity—such as the belief that our tribe will protect us and therefore we owe it our allegiance in return. Anything that calls such a belief into question feels like a personal attack. We instinctively double down on being certain instead of giving in to the discomfort of doubt, so if someone suggests that our core beliefs are wrong, we stop listening and deny the other person’s views. We tell them they’re misinformed, or stupid, or worse. When the threat of doubting our core beliefs is existential, we may even decide the other person is evil and out to destroy all that’s good in the world.

(Shortform note: Whereas Schulz presents doubting core beliefs as a source of existential dread that closes your mind to new perspectives, motivational experts often advise changing core beliefs as a path toward success. In Awaken the Giant Within, Tony Robbins says you should identify which of your core beliefs are harmful and formulate new beliefs to take their place. Likewise, in Atomic Habits, James Clear asserts that identity is just a point of view that evolves over time, and that because of this, you can choose whatever identity you want by challenging old beliefs and adopting new ones. Both Robbins’s and Clear’s approaches involve shifting from “wrong” to “right” in incremental steps so as not to be daunting or painful.)

When confronted with someone else’s blind certainty, it’s easy to see where absolute faith in one’s beliefs can fall short, but it’s hard to recognize that same failing in ourselves. Sometimes we admire doubt in other people, such as when someone admits that they were wrong, though often we view doubt as a sign of weakness. The truth is that doubt provokes anxiety, which is why we find certainty so appealing. Schulz writes that people who wear their doubt on their sleeve, such as undecided voters or religious agnostics, make others uncomfortable not because they’re undecided, but because they display an openness to their own potential for error that undermines the sense of certainty that everyone else relies on.

(Shortform note: In spite of the doubt-based anxiety that Schulz references, religious agnostics and independent voters are both growing as segments of the population, at least within the US. In politics, independents increasingly blame the polarization between the major parties for producing agendas too far to either end of the political spectrum. Meanwhile, those professing to be of no religion haven’t necessarily stopped believing in God, though they’ve turned their backs on organized religion. Therefore, in many cases of political and religious doubt, people may not be challenging their own beliefs as much as they’re questioning the validity of the organizations that purport to represent them.)

Why You’re Wrong

The fact that human error is practically a given might seem to imply that there’s something fundamentally wrong with how human beings perceive the world. That would be true if the purpose of the brain were to perfectly analyze information from your senses, but it’s not. Instead, the brain is optimized to make judgments quickly from limited data and choose the best behavior to promote our survival. As a result, our minds are built on heuristics—mental shortcuts to efficient, best-guess thinking based on limited information, mental models, and the collective judgment of whatever groups we’re a part of. According to Schulz, these tools make our minds amazingly efficient, but they’re also the loopholes through which we make mistakes.

Perception and Reason

The most basic and natural mistake that we make is to trust the evidence of our senses without question. The senses are the mind’s only window on the world, but we underestimate the degree to which that window is clouded by how the brain filters data through an unconscious interpretive process, followed by instinctive reasoning that doesn’t rely on strict rules of logic.

Schulz explains that our conscious minds don’t receive the “raw data” from our senses. Instead, the information we see, hear, and feel is touched up and processed by our unconscious nervous system, such as the way the brain’s visual cortex fills in the gaps between the still frames of a movie. Our brains are designed to fill in the gaps in any sensory data we receive. Doing so gives us a more cohesive awareness of our surroundings, which therefore aids our survival in the wild—but it also opens the door to error, particularly when the brain’s “best guess” to fill those gaps turns out to be wrong.

(Shortform note: Research into the neurological process Schulz describes has shown that in humans—and primates in general—object recognition involves more of the brain than the visual processing centers alone. When an object is partially hidden, giving the visual cortex limited information, neurons start to fire in the sections of the brain associated with memory and problem-solving. This suggests that the “evidence of our eyes” is guided just as much by memory and context as it is by what we actually see. The same researchers also suggest that a breakdown of this process may account for some of the perceptual difficulties experienced by people with autism or Alzheimer’s disease.)

Our misperceptions are amplified when we use them to make decisions, a process that also relies on “best guess” logic more than we’d like to admit, writes Schulz. Rather than using careful logic to make decisions and judgments, our brains default to inductive reasoning—determining what’s likely to be true based on past experience. This mental shorthand allows for quick decisions that tend to be right most of the time, but inductive reasoning also leads us into various cognitive traps, such as making overly broad and biased generalizations while ignoring information that doesn’t jibe with our beliefs. As with our sensory errors, our imprecise reasoning is a natural side effect of the processes that make our brains so efficient.

(Shortform note: Despite the benefits Schulz describes, semi-logical reasoning can also make us vulnerable to mistakes based on rhetorical fallacies—in other words, we can be persuaded to be wrong. In Thank You for Arguing, Jay Heinrichs explains that to be persuasive, an argument doesn’t have to be right, it only has to feel right. To guard against falling prey to such “reasoning,” you have to watch out for several logical-sounding fallacies you’ll come across, such as weak evidence, false dichotomies, straw-man arguments, and false correlations, all of which satisfy the “best guess” logic that Schulz asserts our brains rely on.)

Belief and Imagination

Our potentially faulty judgments based on erroneous sensory data form the shaky foundations on which we build beliefs, giving us even more chances to be wrong. Our beliefs shape everything we do, and they’re so interwoven and interconnected that admitting one of them is wrong can threaten our entire framework of understanding. Schulz discusses how beliefs are formed and how easily we invent them on the fly with little or no evidence to back them up.

Beliefs are stories we tell about the world. We’re conscious of some, such as beliefs about money, but unconscious of others, such as which way is “down.” We cling to some beliefs very tightly, while others change more easily, depending on how important they are. Schulz says that we automatically form beliefs about every new thing or idea we encounter, because otherwise we wouldn’t have a way to determine how to act or predict what will happen. Belief formation is a two-pronged process—part of your mind creates a story to explain what your senses tell you, while another part checks that story against further input from your senses. Either side of this process can break down, resulting in beliefs that are wrong.

A well-performed magic trick is an example of how your senses and storytelling combine to create faulty beliefs. In this instance, though, the limits of your senses are deliberately “hacked” by the magician. Magicians conceal the mechanics of the trick so that no matter how closely you watch, you’re deprived of essential information about it, just as Schulz asserts we are in most situations we experience. Meanwhile, the magician provides a running commentary to shape the narrative forming in your mind—one that’s at odds with the reality of how the trick is actually performed. Even when you know that what you’re seeing isn’t possible, it’s hard to disbelieve the evidence of your eyes and the story the magician tells you.

Your Mind May Deceive You

One example of how an illusionist can create a visual narrative that’s at odds with reality is the classic cups-and-balls magic trick, in which a ball (or some other object) seems to transport itself through solid matter, from underneath one cup into another. This trick was famously “ruined” by the magicians Penn and Teller, who perform it using clear plastic cups so that the audience can see exactly how they’re being fooled. But as Schulz points out regarding belief, even when Penn and Teller explicitly show how they pull off the illusion, the props still seem to move magically.

In a broader sense, the process by which your senses interact with your internal narrative to understand the things you experience is built into the brain’s basic structure. In My Stroke of Insight, neuroscientist Jill Bolte Taylor explains that as you take information in from your senses, your brain’s right hemisphere builds a sensory picture of the world around you, while your brain’s left hemisphere takes that sensory picture and classifies what’s in it using language and numbers. The left brain creates a story based on what the right brain perceives. This chain of processes creates multiple steps where the cognitive errors that Schulz discusses can inject themselves into our mental models of the world.

We’ve already mentioned how senses can fail, but the storytelling aspect of your brain goes wrong when you spin beliefs from sheer imagination. Schulz acknowledges that imagination is an evolutionary gift that lets us solve problems we’ve never faced before, but it goes wrong when we invent stories without any evidential grounding. We do this because our minds crave answers, so we feed them by making up theories. For instance, if your cat disappears and shows up again, you’ll automatically start guessing where it went. The trouble is that these guesses quickly solidify into firmly held beliefs. Admitting that you don’t know something is more uncomfortable than pretending that you do, hence the temptation to believe things too strongly.

When Guessing Goes Wrong

When your mind creates guesses out of whole cloth, it doesn’t do so entirely at random. In Thinking, Fast and Slow, behavioral psychologist Daniel Kahneman—who’s made a career out of studying human error—lists several basic mental heuristics which often lead us astray. These include but are not limited to:

1) Making judgments based on too few examples. For instance, if you see two police cars near each other, you may conclude that police are patrolling the whole area in force.

2) Seeing patterns where none exist. In the US, it often seems like mobile homes are somehow targeted by tornadoes, when in fact they’re no more likely than any other structure to be in a tornado’s path.

3) The anchoring effect. When presented with several pieces of information, you tend to focus on what you’re shown first. For example, if a salesperson gives you a range of prices and then asks how much something should cost, your answer is likely to veer toward whichever number they mentioned first.

Schulz emphasizes how instinctively we use these and other heuristics to form beliefs and make decisions even when there’s little to no information. But in line with Schulz’s thesis that making mistakes isn’t always a bad thing, Kahneman argues that the brain’s error-prone heuristics can be consciously leveraged to help you lead a happy life.

Social Pressure to Believe

So far in this guide, we’ve discussed being wrong as if it only happens on an individual level. However, we don’t form our beliefs on our own, and history has shown that large groups of people can all be wrong at once. Try as you might, there’s no way to avoid learning beliefs and behaviors from the people around you, and when you’re firmly embedded in a group, any fallacies in the thinking of that group are only reinforced by the strength of group identity.

“Think for yourself” is common advice, but unfortunately, you can’t do it. We all rely on other people’s knowledge—there’s too much in the world to learn on our own. The problem is telling whether someone else’s beliefs are worth adopting. Schulz argues that we generally don’t judge other people’s beliefs on the merits of their ideas. Instead, we first decide whether the person is trustworthy—if they are, we accept their beliefs. This is a time-saving shortcut that lets us learn from teachers and parents, determine which news articles to read, and decide which opinion podcasts to listen to. However, this shortcut opens up a world of error because it multiplies our own faulty judgment by that of many others. Mistakes spread like a plague.

(Shortform note: The power of groups to propagate mistakes is so strong that in The Design of Everyday Things, psychologist Don Norman argues that systems which rely on human judgment must factor into their design the social pressures that magnify human error. He writes that it’s not enough—or even productive—to assign individual blame if some facet of a system or an organization is the root cause of the errors its members make. Norman’s solution is to engineer a system on the assumption that people will be wrong and make mistakes, so that those errors can be detected before they spread. Unfortunately, the social systems Schulz describes aren’t engineered but grow organically, making them vulnerable to faulty human nature.)

Schulz says that in years past, we formed many of our beliefs based on the groups we were raised in, but in the Information Age, we seek out and form groups based on shared beliefs. Group consensus is a powerful drug, and in a group based on common ideas, belief in those ideas is self-reinforcing, while the group’s willful blindness screens out any evidence against them. For instance, consider how strongly groups of music fans react to criticism of the artists they enjoy, even when those artists’ work is in decline.

When social status and group membership are defined by your agreement with certain beliefs, then any dissent is an attack on the group and can be punished by shunning, expulsion, or worse. Human beings are social animals, and it’s easier to go along with questionable ideas than lose your group status.

(Shortform note: Research into the behavior of online groups has shown that the strength of group consensus is directly proportional to the group’s size. Potentially more problematic is how the same studies show that group consensus can easily be swayed by artificial “bots” that pretend to voice opinions as group members. When enough of these bots are in agreement with each other, the opinions of the entire online group can shift, showing how easily large social networks can be misled by a handful of bad actors leveraging modern information technology.)

How to Cope With Being Wrong

We’ve all felt the pain of making small mistakes—acknowledging the big ones throws us into turmoil. If you find out you’ve been wrong on a major level, especially about a core belief that part of your worldview hinges upon, it can trigger a full-blown existential crisis. As with grief, there are stages you’ll inevitably go through: gauging the scope of how wrong you’ve been, denying your error or perhaps defending it, and, hopefully, accepting that you were wrong and finding a way to grow in response.

Schulz says that when you learn you’ve been wrong, your first question will be, “By how much?” The scale of how wrong you are determines how many of your beliefs you’ll have to change, and how truthfully you can answer that question depends on how well you can take the emotional punch of admitting your mistakes. Your initial response will also include a measure of denial as a defense mechanism. Short-term denial isn’t necessarily bad. It can give you enough emotional breathing room to face up to your mistake once the initial shock has passed. Long-term denial is a different story. Instead of healthy growth, it’s rooted in deceit—lying to others and lying to yourself.

(Shortform note: While Schulz advocates a somewhat gentle approach to assessing your mistaken beliefs, hedge fund manager Ray Dalio recommends a more stringent form of self-reflection. In Principles, Dalio says that you must pursue the truth of any situation, and that to do so you have to always embrace the possibility that your views are wrong. The first challenge that Dalio draws attention to is overcoming the hurdle of your own ego—instead of assuming that your views are right, you must always ask, “How do I know I’m right?” The next step is to work around your blind spots by staying receptive to ideas that challenge your beliefs, a skill that requires humility and a degree of self-awareness.)

No two people react to being wrong in the same way, though Schulz writes that there are patterns. The storytelling part of your brain will generate theories about why you were wrong, and many of these will be defensive in nature. You may tell yourself that you were almost right, or that you were only wrong in certain ways—like arguing that your company’s new product would have succeeded if only the timing of its release had been better. You might shift the blame for being wrong onto someone else, or you might claim that you’d been right to be wrong—that you were erring on the side of safety. Whatever your approach, what you’re defending against is the pain of being wrong, not the fact of your mistake.

(Shortform note: The defensive measures that Schulz describes treat your beliefs as if they’re precious possessions. In Thinking in Bets, Annie Duke suggests a method to short-circuit your defensive tendencies while protecting you from other mental fallacies. The method is conceptually simple—imagine that you have to place a bet on every belief that you hold. How much money would you be willing to risk on any single one of your cherished beliefs? Duke’s method forces you to confront your ideas with a modicum of doubt and tricks you into weighing your beliefs from a rational, rather than emotional, perspective, which she contends can make you more open-minded and willing to consider alternate points of view.)

Accepting That You’re Wrong

Though Schulz often claims that not all mistakes are bad, responding to them poorly never ends well, and the results of denial can sometimes be disastrous. Whether it’s minor or major in scope, the healthiest way to deal with a mistake is to accept it and use it to grow. To be ready for this, you have to be open to the possibility that you can be wrong, acknowledge that your beliefs are always changing, and face up to the fact that the fight against being wrong is a never-ending struggle.

The upside of learning you’ve been wrong about something is that it opens you up to change and exploration. First, however, Schulz recommends that you cultivate an openness to the chance you could be wrong about a great many things. The manufacturing industry already models this approach by thinking through every possible way the systems it designs might fail. On a personal level, you can start to practice this by using the language of equivocation—“maybe,” “possibly,” “I’m not sure”—instead of speaking and thinking from a place of certainty. Our culture paints uncertainty as weakness, but that’s just another way that our culture may be wrong. Uncertainty helps you stay open to error and cushions the blow of admitting to mistakes.

(Shortform note: Schulz isn’t the only author to recommend doubt as a pathway to growth. In The Subtle Art of Not Giving a F*ck, Mark Manson offers a three-step process for cultivating doubt as a practical skill. The first step is to ask yourself, in any particular situation, “What if I’m wrong?” The follow-up question is, “What would the consequences be if I’m wrong?” The last step is to genuinely think those consequences through. None of Manson’s steps requires admitting that you’re wrong—Manson, like Schulz, only asks that you leave the option on the table and do the mental work to take it seriously.)

To be sure, if you learn that you’ve been wrong in a way that’s fundamental to your sense of self—something that might change your religious beliefs or whether you want to continue in your career—you’re going to go through a painful transition. Nevertheless, Schulz points out that our identities, based on our changeable beliefs, are always in flux. Change is a natural, if painful, part of life that we all experience in one way or another. It may be easier for such change to happen slowly, but there’s always a sense of loss associated with it. The trick is to change your attitude toward being wrong so that you can see it for the lessons that it brings instead of the pain it makes you feel.

(Shortform note: Since Schulz defines right and wrong in terms of change instead of truth, the entire question of making peace with your mistakes is really a question of how well you handle change. In Switch, Chip and Dan Heath make the case that change can be successfully managed if you engage both your rational side and your emotions. You can guide your intellect through difficult changes by seeking out the stories of those who’ve gone through a similar experience while leveraging both positive and negative feelings to push change forward, depending on the context of your situation. Above all, they recommend making sure to celebrate progress along your path toward change rather than regretting the mistakes that came before.)

The final problem with admitting you’re wrong and adopting a different set of beliefs is dealing with the chance that you might still be wrong. Schulz writes that correcting your mistakes is a never-ending process, and unless you stumble on an absolute truth, you’ll have to deal with being wrong for the rest of your life. So is there a point in even trying? Of course! Not only is there value in the path of self-improvement, even if it’s a journey with no end, but being open to your own capacity for error also teaches you to be compassionate toward other people and the unique mistakes they make. The more that people can accept their own errors, the more open they are to new beliefs and ideas, and the more they can see the world through someone else’s eyes.

(Shortform note: An author who embodies the process of change that Schulz describes is British radio host James O’Brien. In How Not to Be Wrong, O’Brien explains how a personal crisis led him to re-examine the culture of toughness and physical violence that shaped the views he espoused on his program. O’Brien realized that what he came to see as his very wrong beliefs about racism, shaming others, and mistreating children not only affected his personal life but also negatively impacted other people through his show. O’Brien concluded that his misconceptions were the result of his childhood trauma, and by following a path such as Schulz recommends, he now advocates for being vulnerable and showing compassion to others.)
