PDF Summary: The Art of Thinking Clearly, by Rolf Dobelli
1-Page PDF Summary of The Art of Thinking Clearly
In The Art of Thinking Clearly, Rolf Dobelli breaks down the most common logical fallacies that inhibit decision-making, including confirmation bias, social proof, and hindsight bias. Dobelli aims to help people recognize and overcome these fallacies so they can make better decisions.
In this guide, we’ll explore Dobelli’s main fallacies, including those caused by humanity’s past as hunter-gatherers and those caused by other sources such as misinterpretation of cause and effect. Along the way, we’ll compare and contrast Dobelli’s ideas with other experts on logical thinking, such as Nassim Nicholas Taleb and Daniel Kahneman. We’ll also provide concrete steps to overcome illogical thinking and explore why logical fallacies occur.
(Shortform note: Some people argue that difficulty with complex math concepts is a result of how math is viewed. People internalize the idea that math is difficult, and those who struggle with its concepts stop trying to understand them. If math were treated like a language, which takes practice but can be learned by anyone, people would learn complex math more easily.)
Here are some situations in which struggling with math negatively impacts your decisions:
The Distribution of Averages
Averages are one of the complex math concepts that your brain isn't evolutionarily prepared for, Dobelli explains. One of the biggest pitfalls when working with averages is ignoring the distribution: the original set of numbers used to calculate the average. Without knowing the distribution, averages are misleading because they don’t show the outliers: the extremes at either end of the distribution that drastically change the average. To get a true average, these outliers must be removed, Dobelli says. This isn’t instinctive, but it’s important for modern life because outliers are increasingly common.
Averages and Scalable Events
Dobelli’s discussion of distribution and outliers finds parallels in Nassim Nicholas Taleb’s The Black Swan. Taleb sorts events into two categories: scalable and non-scalable. Scalable events have no defined limits, while non-scalable events have defined limits. (Taleb notes that Black Swan events—events that are unpredictable yet highly influential—occur solely in scalable situations.)
Most natural events are non-scalable. For example, there’s a defined limit to how much weight a human can lift; strength and weakness revolve around an average, and there are few outliers in that average’s distribution.
On the other hand, many man-made situations and ideas are scalable, with outliers in their averages’ distribution. There’s no upper limit to wealth, for example, which allows for the existence of billionaires—outliers in the distribution of global wealth. The presence of a single billionaire can significantly raise the average wealth of a town, making this average misleading—most people will earn well below the skewed average. Thus, when dealing with scalable situations, understanding averages and their distribution is important to gain an accurate picture of the situation.
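The billionaire example above can be made concrete with a short arithmetic sketch. The numbers here are hypothetical (they aren't from the book), but they show how a single outlier drags the mean far away from what most people actually earn, while the median stays representative:

```python
# Hypothetical incomes for a small town, in dollars:
# 99 typical earners plus one billionaire (the outlier).
incomes = [40_000] * 99 + [1_000_000_000]

mean = sum(incomes) / len(incomes)          # skewed by the single outlier
median = sorted(incomes)[len(incomes) // 2]  # robust to the outlier

print(mean)    # about 10 million: far above what anyone typical earns
print(median)  # 40,000: matches the typical resident
```

This is why, in scalable situations, a median (or the full distribution) often gives a truer picture than the mean.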
Self-Selection Bias
Statistics is another area of math that you’re not evolutionarily primed for, Dobelli says. One statistical error is self-selection bias, in which the nature of the participants in a study influences its outcome. Specifically, people only join studies they’re comfortable responding to, which alters your data, Dobelli says. Those who might provide embarrassing or somehow “undesirable” responses simply won’t take part, narrowing your study’s scope and skewing the results.
(Shortform note: The only way to completely eliminate this bias is to study people who don’t know they’re being studied. However, researchers must receive consent from participants, so this isn’t feasible. That said, you can limit self-selection bias. Most studies do this by collecting demographic information: Researchers look for patterns in the demographics of people who chose to participate and alter how they weigh the responses to reduce self-selection bias. Specifically, they give greater weight to results from those less likely to self-select, and lesser weight to results from those likely to self-select.)
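The reweighting idea in the note above can be sketched in a few lines. All numbers here are invented for illustration: assume group A is 50% of the population but, due to self-selection, only 20% of respondents. Weighting each respondent by (population share / sample share) restores the population balance:

```python
# Assumed shares (illustrative only, not from any real study).
population_share = {"A": 0.5, "B": 0.5}
sample_share = {"A": 0.2, "B": 0.8}  # group A self-selects out of the study

# Weight = how under- or over-represented each group is in the sample.
weights = {g: population_share[g] / sample_share[g] for g in population_share}
# A is up-weighted (2.5), B is down-weighted (0.625).

# Assumed average answer to a yes/no question within each group:
group_mean = {"A": 0.9, "B": 0.3}

raw = sum(sample_share[g] * group_mean[g] for g in group_mean)
adjusted = sum(sample_share[g] * weights[g] * group_mean[g] for g in group_mean)
# The raw sample average (0.42) understates the population average (0.6)
# because the high-"yes" group A was underrepresented.
```

This is a bare-bones version of the post-stratification weighting that real surveys use; in practice, weights are computed across many demographic variables at once.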
Your Memory Is Not as Reliable as You Think It Is
Next, we’ll cover fallacies related to memory. People believe their memories are untouchable, stored away and recalled when needed in perfect condition. However, this isn't the case, Dobelli warns. Your memory is affected by your feelings, opinions, and situation.
(Shortform note: Your memories are affected in these ways at several points: First, whatever you were feeling in the moment is tangled up with the actual situation in your memory; later, every time you remember the situation, your current mental state further alters your memories. Thus, the more you remember a situation, the more distorted the memory becomes.)
In this section, we’ll look at some of the ways in which your memory is unreliable.
Falsification of History
The main reason your memory is unreliable is that your brain is constantly rewriting your memories, Dobelli explains. This is called falsification of history. As your opinions and worldview change over time, your brain alters the details of your memories, making you remember the past in a way that better matches your current opinions and worldview.
(Shortform note: Your brain rewrites memories in this way to be helpful: By updating the information, your memories become more relevant to the current moment and your current decisions. However, rewriting memories also means you become overconfident in your beliefs: When you think you’ve always held the same beliefs, you won’t feel the need to challenge them.)
The Primacy and Recency Effects
Your memory is also influenced by the order in which you receive information and how much time has passed since you received said information. According to Dobelli, the first information you receive is initially easier to remember than information introduced later. This is called the primacy effect. However, this only works for a short time, as the information eventually leaves your short-term memory. After that, whatever information you heard most recently is easier to remember. This is the recency effect.
(Shortform note: How do these effects work? In the case of the primacy effect, when you learn a piece of information early, your brain has more time to repeat it. This keeps it in your short-term memory for longer, until it can be transferred to long-term memory. As for the recency effect, when you learned a piece of information recently, the information is still in your short-term memory and so is easy to recall. You can manipulate these tendencies by memorizing important information first to trigger the primacy effect and reviewing information before you need it to trigger the recency effect.)
You Misinterpret Cause and Effect
In this section, we’ll cover how misinterpreting cause and effect damages your judgment. According to Dobelli, humans struggle to interpret cause and effect because they confuse correlation and causation. When two events coincide, people assume there’s a causal relationship between the two of them, even when there’s not.
(Shortform note: How do people make these mistaken links? They take their knowledge of the effect and look for any similar events that might point to a cause, regardless of the likelihood of that similar event actually being the cause. In other words, they look for possible correlations between the events and mistake this for one causing the other.)
Association Bias
One type of misinterpretation of cause and effect is association bias, or the brain’s tendency to make connections where none exist. Dobelli says this distorts cause and effect by forming false knowledge, where you draw a causal connection between two unrelated things.
Superstitions form this way, Dobelli explains. For example, say you bring rainboots when camping, and the weather is perfect. The next time you go camping, you leave the rainboots behind and the weather is awful. The next time you bring them, the weather is wonderful again. After a few of these experiences, your brain connects the boots and good weather, even though it’s just a coincidence that the weather improved when you brought the boots.
(Shortform note: Why does association bias occur? Dobelli doesn’t say, but others argue that association bias is a defense mechanism: Making connections helps you form “protective frames.” These are practices or support systems that let you evaluate risk (for example, the risk of it raining when you go camping). In our example, the brain mistakenly created a frame in which the presence of rainboots reduces the risk of rain.)
The Fallacy of the Single Cause
Another way people misinterpret cause and effect is by oversimplifying: To construct a simple pattern of cause and effect, people reduce a complex effect to a single cause. This mindset is dangerous because everything is affected by a complex web of causes, Dobelli states. There’s never a single cause for complex effects like crime or success. (Shortform note: Problematically, if you simplify to a single cause, you’ll also simplify to a single solution. For example, if you believe high illness rates are solely due to unaffordable healthcare, you’ll work only to make healthcare affordable. Your singular focus means you don’t realize that other factors like safe housing and income must be addressed too.)
You Struggle to Understand Probability and Predictions
The next set of fallacies we’ll cover revolves around probability and predictions. Dobelli says people hate uncertainty and try to predict future events to alleviate that uncertainty. However, to make accurate predictions, you must understand probability, which humans struggle with. Thus, people’s predictions are usually inaccurate.
(Shortform note: Even though humans demonstrably struggle with probability, and predictions are notoriously unreliable, people still make a living estimating probability and making predictions. This is a form of authority bias: You assume that if a person is making a prediction, they must have based that prediction on experience. However, no matter how knowledgeable the person is, they’ll struggle to process the information needed to make accurate predictions.)
In this section, we’ll cover ways people misunderstand probability and make inaccurate predictions.
Neglect of Probability
According to Dobelli, people struggle to make good decisions because they neglect to consider the probability or risk involved in those decisions. Logically, they should choose the option with the highest probability of going well for them and the lowest risk of going badly. However, people instead choose the option that will have the biggest positive impact on them if it occurs, regardless of how likely it is to occur.
(Shortform note: A type of neglect that Dobelli doesn’t cover is denominator neglect, which Daniel Kahneman describes in Thinking, Fast and Slow. If probability is a fraction, with the situation you’re evaluating as the numerator and the total number of possibilities as the denominator, you’ll base your judgment of risk solely on the numerator and ignore the denominator. This means you’ll regularly misinterpret probability, since the numerator depends on the denominator to accurately show probability.)
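Denominator neglect is easiest to see with a worked comparison. The numbers below are our own illustration (not Kahneman's): an option with a bigger numerator can still be the worse bet once you divide by the denominator:

```python
# Urn A: 8 winning marbles out of 100.  Urn B: 1 winning marble out of 10.
p_a = 8 / 100  # 0.08
p_b = 1 / 10   # 0.10

# Judging by numerators alone ("8 chances vs. 1 chance") favors Urn A,
# but the actual probability favors Urn B.
print(p_b > p_a)  # True
```

People who attend only to the numerator tend to pick Urn A, even though Urn B offers better odds.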
Hindsight Bias
The next fallacy we’ll cover is hindsight bias. Dobelli says hindsight bias makes past events seem like they should’ve been easily predictable. People see an obvious pattern of circumstances that led to a past event occurring, and they think people should have noticed that pattern and predicted the event. At the time, though, the pattern wasn’t clear, so people couldn’t use it to predict the event. It’s only with hindsight that the pattern becomes clear.
Hindsight bias encourages overconfidence, Dobelli says. You think you’re good at detecting patterns when really, you’re not: You’re only seeing them because of hindsight. You thus fail when trying to apply these pattern-spotting “skills” to predicting the future.
(Shortform note: Past events seem obvious because of how your brain predicts: When shown two possibilities, your brain creates reasons why both are possible. However, once Possibility A is proven, your brain doesn’t need to retain information about Possibility B. It forgets that information, making you believe Possibility A was obvious all along. This altered memory also creates overconfidence. You forget any prior uncertainty or incorrect predictions, which reinforces your overconfidence about your pattern-finding and prediction abilities.)
You Value Things for Arbitrary Reasons
In this final section, we’ll cover fallacies that affect how you value things. According to Dobelli, humans tend to put value in a person, situation, or item for arbitrary and illogical reasons.
The Endowment Effect
When you own an item, you subconsciously inflate its value simply because it’s yours, Dobelli explains. This is called the endowment effect.
(Shortform note: This effect stems from loss aversion. Once something is in your possession, you fear losing it, which makes you value the item more. You can avoid this fallacy by avoiding personal connections to items. However, doing so may harm your well-being: Valued belongings become an extension of your identity and a way to express your personality, and preventing those connections from forming can make you feel stifled and unable to be yourself.)
Liking Bias
Liking bias also affects how you value people, specifically. The more you like someone, the more value you put on their opinions and desires, Dobelli says. This means you’re more likely to do something for an individual you like, even if doing so goes against your own interests.
(Shortform note: Dobelli doesn’t say why liking someone makes you value them more. Some experts say you value people you like more because when you like someone, you form an alliance with them. Having a common goal (friendship) unites you and the other person, making you more likely to value them and fulfill their desires.)
The Sunk Cost Fallacy
Another error in thinking that affects how you value things is the sunk cost fallacy. According to Dobelli, the more time, effort, or resources you invest in something, the higher you value that thing. You'll also be more resistant to parting with it, even if keeping it means losing more time, effort, or resources in the future.
(Shortform note: This fallacy stems from a fear of waste: Most people try not to waste time, money, or effort, and letting go of something you’ve invested resources in feels like wasting those resources. While this is technically true—it is a waste of time, money, or effort—continuing to invest resources only creates more waste.)
You can overcome this fallacy by focusing on whether something is serving you in the present and will continue to do so in the future, rather than focusing on what you’ve invested in the past. (Shortform note: Dobelli’s suggestion to focus on the future doesn’t mean ignoring the past: Consider the past to make good decisions based on all the data you’ve collected, but don’t let past effort stop you from moving on.)