PDF Summary: Thinking, Fast and Slow, by Daniel Kahneman
Book Summary: Learn the key points in minutes.
Below is a preview of the Shortform book summary of Thinking, Fast and Slow by Daniel Kahneman. Read the full comprehensive summary at Shortform.
1-Page PDF Summary of Thinking, Fast and Slow
We like to think that we’re intelligent, rational beings who generally make good decisions. However, psychologist Daniel Kahneman says that in reality, the human mind is hasty, imprecise, and lazy. In Thinking, Fast and Slow, Kahneman explains how we make decisions, why those decisions are often wrong, and how we can work around our natural shortcomings to make better decisions in the future.
We’ll begin by describing the “fast” and “slow” systems of thought that Kahneman identifies. Next, we’ll go over the ways our thinking tends to be sloppy and biased, and why that happens so regularly. Finally, we’ll examine Kahneman’s research into happiness, and how a better understanding of ourselves can help support our overall well-being.
In our commentary, we’ll explore some evolutionary origins of cognitive biases, look at how the biases Kahneman highlights relate to additional biases, and compare Kahneman’s insights to those from other psychologists, such as Malcolm Gladwell and Barbara Oakley.
(continued)...
(Shortform note: Anchoring bias may result in part from the phenomenon of psychological priming. When we are exposed to an idea, that idea activates, or primes, parts of our brain, and those parts of the brain stay active as we process further information. This can affect our thinking by anchoring us to the first idea we heard and to the mental connections we drew from it. To avoid anchoring bias, actively try to think of counterarguments or alternative options, and look for reasons why they might be better than the anchored information.)
Narrative fallacy: Kahneman says that people try to create coherent stories to explain random events. Then, because their stories sound plausible, people feel an unjustified level of confidence in their ability to predict future events. For example, two boxers might be so evenly matched that the outcome of their match could go either way. However, sports pundits discussing the match will invent stories about how the loser buckled under the pressure, or the winner “wanted it more.” If those boxers were to have a rematch, the pundits would try to anticipate the winner based on the stories they previously created, even though the outcome would be just as unpredictable as before.
(Shortform note: The narrative fallacy comes from the natural human desire to understand and control, or at least predict, the world around you. For example, psychologists believe that conspiracy theories—extreme narrative fallacies that draw connections between totally unrelated events—are actually self-soothing anxiety responses. If there’s a group secretly orchestrating events (for instance, the Illuminati are popular scapegoats), that means nothing is ever truly random; therefore, any future disaster can be predicted and prepared for. While belief in a global, nearly omnipotent group like the Illuminati might seem terrifying, some people find comfort in having a tangible enemy to fight, rather than being at the mercy of random chance.)
Narrow framing: According to Kahneman, people tend to make decisions based on relatively small pools of information instead of considering the whole picture. There are numerous ways this fallacy can manifest. For example, in the planning fallacy, people tend to overlook all the ways a project could go wrong, and therefore underestimate how much time it will require—they only factor in information about how long it will take in an ideal situation. Another example is the sunk cost fallacy, which happens when people narrow their focus to getting back what they’ve lost on a failed endeavor. However, if they consider all of their options, they’ll see it’s better to cut their losses and invest their resources elsewhere.
(Shortform note: Narrow framing may be the unavoidable result of working memory’s natural limitations. Your working memory—where your brain stores ideas that you’re currently using to solve a problem or make a decision—can only hold a few pieces of information at once. Researchers disagree on exactly how many ideas you can hold at once, and it varies from person to person, but common estimates place the average working memory capacity at somewhere from two to four ideas. As a result, regardless of whether you’re using System 1 or System 2 thinking, it’s simply not possible to consider everything when making a decision.)
Two Theories of Decision-Making
So far we’ve discussed the two ways people think as well as a number of ways our conclusions can be imprecise and biased. Now let’s explore how Kahneman uses those principles to explain, in broad terms, why people don’t always make the best decisions. We’ll start by explaining expected utility theory, the traditional view that assumes people are rational and will always make the choices that benefit them the most. We’ll then introduce Kahneman’s prospect theory, which takes into account the ways that emotions affect people’s judgment.
Expected Utility Theory Assumes People Are Perfectly Logical
The traditional theory of decision-making, known as expected utility theory, asserts that people rationally calculate how much they stand to gain or lose in each potential situation. Then, based on those calculations, they make the choice that’s most likely to lead to the greatest personal benefit.
However, as we’ve already discussed, people are not purely rational actors. Therefore, Kahneman argues that expected utility theory is not an effective way to explain people’s actions.
For example, suppose someone presents you with two options:
- An 80% chance to win $100, with a 20% chance to win only $10
- A 100% chance to win $80
If you calculate the average outcome of the first option based on probability, its expected value is greater ($82 versus $80). Therefore, according to utility theory, people should always choose it. However, Kahneman says most people will choose the second option because they prefer certainty over the chance to win more money.
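Kahneman's example can be checked with a few lines of arithmetic. This sketch (the function name is ours, not the book's) computes the expected value of each gamble as the probability-weighted sum of its payoffs:

```python
def expected_value(outcomes):
    """Expected value of a gamble, given (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

risky = [(0.80, 100), (0.20, 10)]   # 80% chance of $100, 20% chance of $10
sure = [(1.00, 80)]                 # guaranteed $80

print(expected_value(risky))  # 82.0
print(expected_value(sure))   # 80.0
```

Despite the risky option's higher expected value, prospect theory predicts (and experiments confirm) that most people take the sure $80.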
Expected Utility Vs. Expected Value
Kahneman describes the flaws of expected utility theory at length, but it should be noted that this theory was, itself, a response to an even earlier theory called expected value.
Expected utility theory emerged as an answer to a thought experiment called the St. Petersburg Paradox: a game of chance where, in theory, you could win an infinite amount of money, but are much more likely to win very little. Expected value theory states that, since there’s a chance of winning an infinite amount of money, it’s rational to pay any amount of money to play. However, this is clearly flawed reasoning: Nobody would pay billions of dollars just for a minuscule chance of winning even more.
Expected utility theory tries to resolve the paradox by pointing out that, beyond a certain point, more money is not useful; therefore, even infinite money has finite utility. Now that there’s a limit to the game’s potential benefit, it makes rational sense to weigh that against how likely you are to lose money when you play, and decide how much a round of this game is actually worth to you.
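The summary doesn't spell out the game's rules, so this sketch uses the standard formulation: flip a fair coin until it lands heads, and if heads first appears on flip k, the payout is 2^k dollars. Summing the first 60 rounds shows why the expected value diverges while a diminishing (here, logarithmic) utility stays finite:

```python
import math

# Round k happens with probability 2**-k and pays 2**k dollars,
# so every round contributes exactly $1 of expected value.
rounds = 60
ev = sum(2**-k * 2**k for k in range(1, rounds + 1))
eu = sum(2**-k * math.log(2**k) for k in range(1, rounds + 1))

print(ev)  # grows without bound: equals the number of rounds summed (60.0)
print(eu)  # converges toward 2 * ln(2), about 1.386
```

This is essentially Bernoulli's resolution of the paradox: with diminishing utility, the game is worth only a modest, finite amount, so refusing to pay a fortune to play is rational.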
Prospect Theory Accounts for Human Emotions
If people don’t make decisions based on pure rationality and expected value, how do they make decisions? To answer that question, Kahneman developed prospect theory, which factors in various ways that emotions influence judgment.
The author summarizes prospect theory in three points:
1. You evaluate situations by comparing them to a neutral reference point. Your reference point is usually your status quo—the situation you normally experience—meaning you make decisions that you think will improve your status quo. However, your reference point can also be an outcome you expect or feel entitled to, like an annual raise. That’s why you can feel crushed when you don’t get something you expected even though your situation hasn’t actually changed.
(Shortform note: Arguably, your status quo is also an expectation: You expect your current situation to continue. So, we can sum up this aspect of prospect theory using the “equation” happiness equals reality minus expectations. This formula explains why you’re unhappy when reality doesn’t live up to your expectations: Your happiness is negative. Thus, according to this “formula,” the simplest way to become happier is to lower your expectations. In theory, if expectations = 0 (that is, if you have no expectations at all) you’ll be happy with whatever reality you have. That’s one way to approach what Tara Brach calls Radical Acceptance—not expecting or anticipating anything, but simply accepting each moment as it comes.)
2. Your evaluations are proportional, rather than fixed. Kahneman explains that you judge value as a percentage of what you already have. For example, rationally speaking, gaining $100 should always have the exact same value. However, going from $100 to $200 (a 100% increase) feels much more significant than going from $1,000 to $1,100, which is only a 10% increase. To extend the example, if you were already a millionaire, $100 wouldn’t even seem worth your notice—it would be a fraction of a percent of what you already have, and barely feel different from finding a quarter on the ground.
(Shortform note: These proportional evaluations make more sense when you consider that you often don’t need, nor even want, more of something that you already have a great deal of. This idea is closely related to what economists call the Law of Diminishing Marginal Utility: The more of something you have, the less benefit you experience from getting one more of that thing. This is why $1 is practically worthless to someone who already has a lot of money, but for someone in poverty, one more dollar might allow them to afford a meal or some other necessity. For another example, suppose you’re very thirsty; you’d greatly value a glass of water, but a second glass would hold significantly less benefit for you, and a third would have less value still.)
3. Losses of a certain amount trigger stronger emotions than a gain of the same amount. To give a fairly mundane example, the happiness you feel when a bartender hands you a drink is much less than the disappointment you’d feel if you spilled it. Kahneman says this phenomenon, called loss aversion, is a result of evolution: Organisms that treat threats more urgently than opportunities tend to survive and reproduce better.
(Shortform note: While Kahneman presents loss aversion as an irrational bias in our thinking, statistician Nassim Nicholas Taleb argues that it’s actually extremely rational. In Skin in the Game, Taleb says loss aversion is a symptom of our instinct to prevent ruin—a loss so great that it’s impossible to recover from. Taleb notes that opportunities for ruin-inducing losses are all around us, and that small losses can add up over time and ruin us. Therefore, it’s rational that we have strong emotions regarding losses: Logically, we should do everything we can to avoid even the smallest possibility of ruin.)
Kahneman also discusses some practical implications of prospect theory. Two key implications are the possibility effect and the certainty effect.
Implication #1: The Possibility Effect
Kahneman says that people will overvalue the mere possibility of something happening, even if it’s still highly unlikely.
As an example, consider which of these options seems more meaningful:
- Going from a 0% chance of winning $1 million to a 5% chance
- Going from a 5% chance of winning $1 million to a 10% chance
Most likely you had a stronger response to the first option, even though the objective increase in value is the same for both.
The possibility effect explains why people fantasize about small chances of big gains—such as when they go to casinos or play the lottery. It also explains why people obsess over worst-case scenarios, even when there’s only a tiny chance of those scenarios coming to pass.
The Possibility Effect Makes Sense for Extreme Events
Similar to our previous discussion of ruin, the Possibility Effect may be more rational than it first appears, especially when dealing with extreme situations.
When assessing the risk level of a negative event, it’s crucial to factor in both the likelihood of that event happening and its impact if it does happen. Therefore, going from no chance of an event happening (zero risk) to even a small chance (some risk) is, proportionally, an infinite increase in risk. For example, this is why many people argue against building nuclear power plants: While the odds of another event like Chernobyl are very low, the impact would be so catastrophic that people find even the possibility unacceptable.
We can also invert this idea to account for positive events, such as the above example of winning $1 million—going from a 0% to a 5% chance is an infinite increase in your odds of getting rich. Conversely, going from a 5% chance to a 10% chance is much less impactful; the possibility already existed, now it’s just slightly more likely to happen.
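The "infinite increase" framing is just a statement about relative versus absolute change. A tiny sketch (the helper is ours, purely illustrative) makes the asymmetry concrete:

```python
def relative_change(old, new):
    """Relative increase in probability; infinite when starting from zero."""
    if old == 0:
        return float("inf")
    return (new - old) / old

print(relative_change(0.00, 0.05))  # inf  (possibility effect: 0% -> 5%)
print(relative_change(0.05, 0.10))  # 1.0  (merely doubling an existing chance)
```

Both moves add the same 5 percentage points, but only the first creates a possibility where none existed.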
Implication #2: The Certainty Effect
Just like people overvalue the idea of a previously impossible outcome becoming merely unlikely, Kahneman says we also place too much value on the idea of a likely outcome becoming certain.
To illustrate this, consider the following situations:
- You’re in the hospital, and your prognosis goes from a 90% chance of recovery to a 95% chance of recovery.
- You’re in the hospital, and your prognosis goes from a 95% chance of recovery to a 100% chance of recovery.
Most likely, you felt better about the second than the first. This demonstrates how people overvalue absolute certainty, and undervalue outcomes that are almost guaranteed but not quite certain. A 95% chance of recovery is actually a great prognosis, but it doesn’t feel that way because the remaining 5% continues to bother you.
(Shortform note: Neuroscience can provide an explanation for why we value certainty so highly. When we face uncertain situations, specific regions in the brain become hyperfocused on potential threats. When this happens, uncertainty consumes parts of our working memory that we’d otherwise use for focus, creativity, and decision-making. Therefore, by eliminating uncertainty, we ease our anxiety and become better able to make thoughtful, well-reasoned choices. In other words, certainty doesn’t just make us feel more confident in our decisions, in many cases it actually allows us to make better decisions than we could while distracted by possible dangers and “what-if” thoughts.)
Happiness and the Two Selves
So far we’ve discussed Kahneman’s two systems of thought, as well as a number of ways in which our thinking tends to be biased and irrational. Now we’ll explore how those principles culminate in two distinctly different “selves” within each of us—a theory that Kahneman developed while researching the topic of happiness.
We’ll begin this final section by explaining Kahneman’s concept of two selves: the experiencing self and the remembering self. We’ll then describe various ways in which the remembering self (and our tendency to focus too heavily on it) skews our reasoning about our own happiness and well-being. We’ll conclude with Kahneman’s suggestion that both selves are important, and that we must learn how to balance their needs.
The Experiencing Self and the Remembering Self
Kahneman identifies two distinct aspects of how we process happiness and experience:
The experiencing self lives moment-to-moment, feeling pleasure and pain as it happens. This self measures happiness by keeping a running total of your positive and negative feelings as they occur; the more positive the “sum,” the happier you are.
(Shortform note: As we’ll discuss, people tend to focus very heavily on the remembering self and neglect the experiencing self. One way to get more in touch with your experiencing self is to practice mindfulness meditation, which trains you to accept each moment-to-moment experience as it happens, then let it fade away naturally—you don’t judge the experiences as “good” or “bad,” and therefore you don’t try to consciously remember the good experiences or block out the bad ones.)
In contrast, the remembering self reflects on past events, and only evaluates them once they’ve passed. As a result, it measures happiness very differently from the experiencing self. Kahneman identifies two key patterns that the remembering self uses to evaluate past events:
1. The peak-end rule: This measurement depends mainly on an event’s peak intensity (positive or negative) and how it ends, not an overall average of how it felt. For example, a musical with one excellent song and a strong ending is likely to earn good reviews, even if the majority of the show is mediocre.
(Shortform note: The peak-end rule has significant implications for how we think about our relationships. We tend to view relationships—whether current or past—through the lens of a few key moments, and evaluate the entire relationship as positive or negative based only on those moments. However, you can intentionally create more of those moments to help you remember the entire relationship more accurately. One effective way to do this is simply by diversifying the experiences you and your partner share together. For instance, instead of going to the same bar or restaurant every time you go out, make a point of trying out places you’ve never been to; each of those new experiences will have its own peak and end moments to remember.)
2. Duration neglect: How long something lasts has little impact on how we remember it. For instance, two people with similarly painful injuries (say, ankle sprains of equal intensity) will feel roughly the same about those injuries after the fact, even if one person only needed a month to heal while the other took six months to get back to full health.
Counterpoint: Duration Has an Indirect, but Significant Impact
As a counterpoint to Kahneman, a study from 2020 found that the duration of an experience does have a significant impact on how we remember it, but the effect is indirect. While the researchers agree that people only remember key moments about an experience, how long the experience lasts changes how they remember those moments.
Applying this study to the previous example, two people with painful injuries will have similar memories about their experiences: They’ll remember only a few key moments, regardless of how long they were injured and in pain for. However, they’d remember those key moments differently. For instance, the person who took six months to heal might remember their most painful moments as significantly worse, because when recalling them, they felt fatigued and frustrated (due to their lengthy healing process). The person who only took one month to heal might remember those painful moments as being less severe, as they were better able to cope with those moments. Conversely, the person who took longer to heal might remember those moments less intensely because they’d gotten so accustomed to the pain already.
The Remembering Self Skews Our Judgment
Kahneman says that, because of its reliance on System 1 thinking, the remembering self distorts how we measure our own happiness, leading us to believe our lives are better or worse than they really are. Furthermore, because we make decisions by using our memories as reference points, we tend to heavily weigh decisions toward the remembering self and overlook the needs of the experiencing self.
Some key flaws in the remembering self’s reasoning include:
Needless suffering: People often make choices that cause the experiencing self to suffer, but end with rewards that the remembering self will enjoy (due to the peak-end rule). For example, boxers regularly suffer through harsh training and brutal fights just for the chance of a memorable victory. While this may seem reasonable if good memories will strongly outweigh bad ones, Kahneman argues that the experiencing self’s pain still negatively impacts your overall happiness, and it’s better to avoid such suffering whenever possible.
(Shortform note: Objectively speaking, your past experiences can definitely impact your current happiness and well-being—consider how a sprained ankle affects your happiness regardless of whether you know how you hurt it. To further illustrate Kahneman’s point, research shows that people who experienced childhood trauma can suffer from ongoing physical and mental PTSD symptoms, even if they don’t remember the traumatic experiences. Similarly, you might have aches and pains from old injuries you don’t remember getting, or certain situations might put you on edge for reasons you can’t explain.)
The Focusing Illusion: When trying to evaluate their overall happiness, people place too much emphasis on whatever’s currently on their mind—in other words, whatever the remembering self is thinking about at the moment. For example, suppose a couple is going through a rough patch; they might evaluate their entire marriage as a net negative because they’re only remembering the problems they’ve recently had, even though both partners are generally happy with each other. On the other hand, if someone specifically asks how good their marriage is when they’re not fighting, that would shift their focus and probably change their answer.
(Shortform note: One possible way to mitigate the focusing illusion is to consider the same issue while you’re in various different moods or frames of mind. Apocryphally, ancient Persians used to deliberate over important decisions twice: once while drunk and once while sober. They did this to shift their focus and consider different aspects of a situation. Supposedly, only a decision that seemed wise in both states (drunk and sober) was acceptable. While it may not be advisable to get drunk every time you have to evaluate a situation, there is value in reconsidering a decision later, especially one made in an emotional moment—you may find that you make a different choice once your feelings have settled and your focus has shifted.)
Inaccurate Predictions: Kahneman says that people consistently overestimate how much changes will affect their future happiness (positively or negatively) because their current remembering self overestimates how much their future experiencing self will think about those changes. In reality, people quickly adapt to new circumstances and stop thinking about them at all. For example, people commonly think they’d be happier if they had more money. However, once they achieve their financial goals, that level of wealth becomes their new normal, and their happiness settles back to the level it was at before.
(Shortform note: These inaccurate predictions about the future lead to a phenomenon that psychologists call the hedonic treadmill: People chase after something they think will make them happy, enjoy a moment of pleasure when they get it, but quickly return to their previous level of happiness. They then start chasing the next thing in order to recapture that feeling. It’s referred to as a treadmill because people constantly “run” after happiness, but always end up in the same place emotionally. The same is true of negative experiences—after a brief period of feeling upset or angry, people go back to feeling how they did before.)
Conclusion: Both Selves Are Important
Kahneman urges you to find a balance between these two selves, because focusing too much on one or the other creates problems.
Focusing only on the remembering self invites unnecessary suffering. If you only value the remembering self, you might endure decades of pain in the hopes of a brief period of happiness at the end. Conversely, you might avoid long periods of happiness because you’re afraid they’ll end poorly.
For example, you may choose a career you don’t enjoy simply because it pays well. You would then devote the majority of your life to something that makes you unhappy because you think it will allow you to enjoy the relatively brief period between your retirement and your death.
On the other hand, Kahneman warns that focusing only on the experiencing self ignores the potential for lasting harm that some moments can bring. Therefore, this approach can lead to shortsighted decisions that maximize your immediate pleasure while harming your future self.
In short, both selves are important. To maximize your well-being and happiness, Kahneman says you must consider the needs of both your experiencing self and your remembering self. This means you must weigh the moment-to-moment experiences of living against the long-term value you’ll derive from your memories, and find a balance where you can be satisfied both in the present and in the future.
Be Content With the Present, Yet Excited for the Future
Kahneman’s advice to balance the needs of your two selves echoes what Daniel Z. Lieberman and Michael E. Long say in The Molecule of More: Long-term happiness requires you to balance excitement for future possibilities with contentment with your present circumstances.
The authors explain that many people spend their lives pursuing “more”—more money, more possessions, more extreme experiences, and so on—because getting what they want provides a pleasurable rush of dopamine. However, dopamine keeps you focused on future possibilities; you need to engage more present-focused areas of your brain to enjoy what you already have.
Lieberman and Long say the easiest way to find this balance is to look for a career or hobby that demands your full attention in the present, but also gives you future goals to work toward. This works because activities that keep you focused will stop you from thinking about the future, while having milestones to look forward to helps satisfy the dopamine-driven urge for “more.”
Painting is an excellent example of this. Each brushstroke demands the artist’s full attention, yet the artist also needs to have an idea of what the painting will look like when it’s finished—that final product is the goal they’re working toward.
Want to learn the rest of Thinking, Fast and Slow in 21 minutes?
Unlock the full book summary of Thinking, Fast and Slow by signing up for Shortform.
Shortform summaries help you learn 10x faster by:
- Being 100% comprehensive: you learn the most important points in the book
- Cutting out the fluff: you don't spend your time wondering what the author's point is
- Interactive exercises: apply the book's ideas to your own life with our educators' guidance
Here's a preview of the rest of Shortform's Thinking, Fast and Slow PDF summary:
PDF Summary Part 1-1: Two Systems of Thinking
...
System 1 can be completely involuntary. You can’t stop your brain from completing 2 + 2 = ?, or from finding a cheesecake delicious. You can’t unsee optical illusions, even if you rationally know what’s going on.
System 1 can arise from expert intuition, trained over many hours of learning. In this way a chess master can recognize a strong move within a second, where it would take a novice several minutes of System 2 thinking.
System 2 requires attention and is disrupted when attention is drawn away. More on this next.
System 1 automatically generates suggestions, feelings, and intuitions for System 2. If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions.
System 1 can detect errors and recruits System 2 for additional firepower.
- Kahneman tells a story of a veteran firefighter who entered a burning house with his crew, felt something was wrong, and called for them to get out. The house collapsed shortly after. He only later realized that his ears were unusually hot but the fire was unusually quiet, indicating the fire was in the basement.
Because System...
PDF Summary Part 1-2: System 2 Has a Maximum Capacity
...
Kahneman cites one particular task as the limit of what most people can do in the lab; it dilates the pupils by 50% and increases heart rate by 7 bpm. The task is “Add-3”:
- Write several four-digit numbers on separate index cards.
- To the beat of a metronome, read the four digits aloud.
- Wait for two beats, then report a string in which each of the original digits is incremented by 3 (e.g., 4829 becomes 7152).
If you make the task any harder than this, most people give up. Mentally speaking, this is sprinting as hard as you can, whereas casual conversation is a leisurely stroll.
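As a side note, the Add-3 transformation itself is trivial for a computer, which underscores that the task strains human working memory, not arithmetic. A minimal sketch, assuming (as the 4829-to-7152 example implies) that digits wrap around past 9:

```python
def add_3(digits: str) -> str:
    """Increment each digit by 3, wrapping past 9 (so 8 -> 1, 9 -> 2)."""
    return "".join(str((int(d) + 3) % 10) for d in digits)

print(add_3("4829"))  # "7152"
```

Holding four digits, a metronome beat, and an in-progress answer in mind simultaneously is what saturates System 2, not the additions themselves.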
Because System 2 has limited resources, stressful situations make it harder to think clearly. Stressful situations may be caused by:
- Physical exertion: During intense exercise, you need mental resources to resist the urge to slow down. Even a physical activity as relaxed as taking a stroll uses some mental resources, which is why complex reasoning is best done while sitting still.
- The presence of distractions.
- Exercising self-control or willpower in general.
Because of the fixed capacity, you cannot will yourself to think harder in the moment and surpass the...
PDF Summary Part 1-3: System 1 is Associative
...
Ready?
You might have found that the second one felt better. Isn’t that odd? There is a very faint signal from the associative machine of System 1 that says “these three words seem to connect better than the other three.” This occurred long before you consciously found the word (which is sea).
Association with Context
For another example, consider the sentence “Ana approached the bank.”
You automatically pictured a number of things: the bank as a financial institution, Ana walking toward it.
Now let’s add a sentence to the front: “The group canoed down the river. Ana approached the bank.”
This context changes your interpretation automatically. Now you can see how automatic your first reading of the sentence was, and how little you questioned the meaning of the word “bank.”
Associations Evaluate Surprise
The purpose of associations is to prepare you for events that have become more likely, and to evaluate how surprising the event is.
The more external inputs associate with each other, and the more they...
PDF Summary Part 1-4: How We Make Judgments
...
2) Mental Shotgun
System 1 often carries out more computations than are needed. Kahneman calls this “mental shotgun.”
For example, consider whether each of the following three statements is literally true:
- Some roads are snakes.
- Some jobs are snakes.
- Some jobs are jails.
All three statements are literally false. The second statement likely registered as false more quickly, while the other two took more time to think about because they are metaphorically true. But even though finding metaphors was irrelevant to the task, you couldn’t help noticing them, and so the mental shotgun slowed you down: System 1 made more calculations than it had to.
Heuristics: Answering an Easier Question
Despite all the complexities of life, notice that you’re rarely stumped. You rarely face situations as mentally taxing as having to solve 9382 x 7491 in your head.
Isn’t it profound how we can make decisions at all without realizing it? You like or dislike people before you know much about them; you feel a company will succeed or fail without really analyzing it.
When faced with a difficult question, System 1 substitutes an easier question, or...
PDF Summary Part 1-5: Biases of System 1
...
In other words, your emotional response fills in the blanks for what’s cognitively missing from your understanding.
The Halo Effect forms a simpler, more coherent story by generalizing one attribute to the entire person. Inconsistencies (liking one thing about a person but disliking another) are harder to process. That’s why a statement like “Hitler loved dogs and little children” is troubling for many to comprehend.
Ordering Effect
First impressions matter. They form the “trunk of the tree” to which later impressions are attached like branches. It takes a lot of work to reorder the impressions to form a new trunk.
Consider two people who are described as follows:
- Amos: intelligent, hard-working, strategic, suspicious, selfish
- Barry: selfish, suspicious, strategic, hard-working, intelligent
Most likely you viewed Amos as the more likable person, even though the five words used are identical, just differently ordered. The initial traits change your interpretation of the traits that appear later.
This explains a number of effects:
- Pygmalion effect: A person’s expectation of a target person affects the target...
PDF Summary Part 2: Heuristics and Biases | 1: Statistical Mistakes
...
If it’s not lifestyle, what’s the key factor here? Population size. The outliers in the high-cancer areas appeared merely because the populations were so small. By random chance, some rural counties would have a spike of cancer rates. Small numbers skew the results.
Case 2: Small Classrooms
The Gates Foundation studied educational outcomes in schools and found small schools were habitually at the top of the list. Inferring that something about small schools led to better outcomes, the foundation tried to apply small-school practices at large schools, including lowering the student-teacher ratio and decreasing class sizes.
These experiments failed to produce the dramatic gains they were hoping for.
Had they inverted the question - what are the characteristics of the worst schools? - they would have found these schools to be smaller than average as well.
When falling prey to the Law of Small Numbers, System 1 is finding spurious causal connections between events. It is too ready to jump to conclusions that make logical sense but are merely statistical flukes. **With a surprising result, we immediately skip to understanding causality rather than...
PDF Summary Part 2-2: Anchors
...
Sometimes, the anchor works because you infer the number was given for a reason, and it’s a reasonable place to adjust from. But even meaningless numbers, such as random dice rolls, can anchor you.
The anchoring index measures how effective the anchor is. The index is defined as: (the difference between the average guesses when exposed to two different anchors) / (the difference between the two anchors). Studies show this index can be over 50%! (A measure of 100% would mean the person in question is not only influenced by the anchor but uses the actual anchor number as their estimate; conversely, a measure of 0% would indicate the person has ignored the anchor entirely.)
- For example, one study asked Group A two questions: 1) Is the tallest redwood taller or shorter than 1,200 feet? and 2) What do you think the height of the tallest redwood is? They asked Group B the same two questions, except the anchor in the first question was 180 feet rather than 1,200 feet.
- Did the anchors in the first question affect the estimates given in answer to the second question? Yes. The Group A mean was 844 feet and the Group B mean was 282 feet. This produced an...
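Plugging the redwood study’s numbers into the anchoring-index definition above gives the result directly (a quick sketch):

```python
# Anchoring index = (difference between mean guesses) / (difference between anchors)
high_anchor, low_anchor = 1200, 180          # feet, from the two versions of question 1
mean_guess_high, mean_guess_low = 844, 282   # feet, mean answers to question 2

anchoring_index = (mean_guess_high - mean_guess_low) / (high_anchor - low_anchor)
print(f"Anchoring index: {anchoring_index:.0%}")  # -> Anchoring index: 55%
```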
PDF Summary Part 2-3: Availability Bias
...
- Items that are covered more in popular media take on a greater perceived importance than those that aren’t, even if the topics that aren’t covered have more practical importance.
As we’ll discuss later in the book, availability bias also tends to make us weigh small risks too heavily. Parents anxiously waiting for their teenage child to come home at night are obsessing over the fears most readily available to their minds, rather than the realistic, low chance that the child is actually in danger.
Availability Bias and the Media
Within the media, availability bias can cause a vicious cycle where something minor gets blown out of proportion:
- A minor curious event is reported. A group of people overreact to the news.
- News about the overreaction triggers more attention and coverage of the event. Since media companies make money from reporting worrying news, they hop on the bandwagon and make it an item of constant news coverage.
- This continues snowballing as increasingly more people see this as a crisis.
- Naysayers who say the event is not a big deal are rejected as participating in a coverup.
- Eventually, all of this can affect...
PDF Summary Part 2-4: Representativeness
...
Again, by pure number of people, there are far more people in the latter group than the former.
Why Do We Suffer from the Representativeness Heuristic?
Representativeness is used because System 1 desires coherence, and matching like to like forms a coherent story that is simply irresistible.
The representativeness heuristic works much of the time, so it’s hard to tell when it leads us astray. Say you’re shown an athlete who’s thin and tall, then asked which sport he plays. You’d likely guess basketball more than football, and you’d likely be correct.
(Shortform note: the representativeness heuristic causes problems when your System 1 forms a coherent story that is inaccurate. Common problems involve stereotypes that cause incorrect snap judgments:
- When hiring for a role, you might hire based on a stereotype of how that role should behave, rather than the work the person does. For example, if you expect engineers to be plain and soft-spoken, a candidate who’s fashionable and outgoing might strike you as suspicious.
- While in the Israeli army, Kahneman was tasked with evaluating which recruits would be best suited for officer positions. He confidently gave...
PDF Summary Part 2-5: Overcoming the Heuristics
...
Are Smart Predictions Always Good?
Kahneman notes that absence of bias is not always what matters most. Relying too heavily on statistics can prevent you from predicting rare or extreme events from shaky information.
For example, venture capitalists make their money by correctly predicting the few companies that will make it big. They lose only 1x their money on a bad investment but make 1000x their money on a Google, so it’s important not to miss out on the next Google. However, using the type of quantitative analysis above might paralyze some investors, if they start with the baseline failure rate of startups (which is very high) and have to adjust upward from that anchor. For some people prone to paralysis, having distorted overoptimistic evidence might be better.
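The portfolio arithmetic can be sketched with the text’s stylized 1x-loss / 1000x-gain payoffs; the 1% hit rate below is a hypothetical number, not a figure from the book:

```python
# Expected value of a VC-style portfolio: rare huge wins can outweigh
# a very high base rate of failure.
p_big_win = 0.01                     # hypothetical chance an investment is "the next Google"
payoff_win, payoff_loss = 1000, -1   # multiples of the invested amount, per the text

expected_multiple = p_big_win * payoff_win + (1 - p_big_win) * payoff_loss
print(f"Expected return per $1 invested: {expected_multiple:.2f}x")  # -> 9.01x
```

Even at a 99% failure rate, the expected value per dollar is strongly positive, which is why missing the one big winner is the costly mistake.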
Similarly, sometimes the evidence is against you but your choice feels right, like when you know the high divorce rate as you’re about to get married. In these cases, you might be happier deluding yourself with extreme predictions—”our marriage is going to defy the odds.” Listening to our intuitions is more pleasant and less hard work than acting consciously against them. **But at the least, be aware of what...
PDF Summary Part 3: Overconfidence | 1: Flaws In Our Understanding
...
- The book _Built to Last_ profiled companies at the top of their field; these companies did not outperform the market after the book was published.
- Management literature also tries to find patterns to management systems that predict success. Often, the results are disappointing and not enduring.
- The correlation between a firm’s success and the quality of its CEO might be as high as .30, which is lower than what most people might guess. Practically, this correlation of .30 suggests that the stronger CEO would lead the stronger firm in 60% of pairs, just 10% better than chance.
- We readily trust our judgments in situations that are poor representatives of real performance (like in job interviews).
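The jump above from a .30 correlation to “60% of pairs” can be checked with the standard concordance-probability formula for bivariate-normal variables, P = 1/2 + arcsin(r)/π (the formula is our assumption; the book just states the result):

```python
import math

def concordance_probability(r):
    """Probability that the higher-ranked X also has the higher Y,
    for a bivariate-normal pair with correlation r."""
    return 0.5 + math.asin(r) / math.pi

p = concordance_probability(0.30)
print(f"CEO quality vs. firm success (r=0.30): {p:.0%} of pairs")  # -> 60% of pairs
```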
Funnily, in some situations, an identical explanation can be applied to both possible outcomes. Some examples:
- During a day of stock market trading, an event might happen, like the Federal Reserve System lowering interest rates. If the stock market goes up, people say investors are emboldened by the move. If the stock market goes down, people say investors were expecting greater movement, or that the market had already priced it in. That the same...
PDF Summary Part 3-2: Formulas Beat Intuitions
...
- How do you predict whether newborns are unhealthy and need intervention? A while back, doctors used their (poor) judgment. Instead, in 1952 doctor Virginia Apgar invented the Apgar score, a simple algorithm that takes into account 5 factors, such as skin color and pulse rate. This is still in use today.
- How do you predict wine prices for a bottle of wine? Traditionally, wine enthusiasts tasted the bottle, then assigned a hypothetical price. Instead, an economist used only two variables—the summer temperature and rainfall of the vintage. This was more accurate than humans, but wine experts were aghast—“how can you price the wine without tasting it?!” The formula was in fact better because it did not factor in human taste.
There is still a stigma around letting algorithms pervade daily life, as though they strip away some of life’s romance.
- Professionals who use intuition to predict things feel outrage when algorithms encroach on their profession. They feel many of their predictions do turn out correct and that they have skill, but their downfall is they don’t know the boundaries of their skill.
- It seems more heart-wrenching to lose a child due to an algorithm’s...
PDF Summary Part 3-3: The Objective View
...
When estimating for a project, you tend to give “best case scenario” estimates, rather than confidence ranges. You don’t know what you don’t know about what will happen—the emergencies, loss of motivation, and obstacles that will pop up—and you don’t factor in buffer time for this.
Kahneman gives an example of a curriculum committee meeting to plan a book. They happily estimate 2 years for completion of the book. Kahneman then asks the editor how long other teams have taken. The answer is 7-10 years, with 40% of teams failing to finish at all. Kahneman then asks how their team skill compares to the other teams. The answer is Kahneman’s team is below average.
This was an astounding example of how a person may have relevant statistics in her head, but then completely fails to recall this data as relevant for the situation. (The book did indeed take 8 years.)
Furthermore, before Kahneman asked his questions, the team didn’t even feel they needed information about other teams to make their estimate. They looked only at their own situation.
Government projects have a funny pattern of being universally over budget and delayed. (Though there may be an underlying...
PDF Summary Part 4: Choices | 1: Prospect Theory
...
Bernoulli then argued that utility and wealth have a logarithmic relationship: the difference in happiness between having $1,000 and having $100 is the same as between $100 and $10. In other words, money has diminishing marginal utility; each additional dollar adds less happiness than the one before.
This concept of logarithmic utility neatly explained a number of phenomena:
- This meant that $10 was worth more to someone with $20 than to someone with $200. This aligns with our intuition - people with more money are less excited than poorer people about the same amount of money.
- This explained the value of certainty in gamble problems, like the 80% chance question above. On a logarithmic utility scale, a guaranteed $80 is worth more than an 80% chance of $100.
- This also explained insurance - people with less wealth were willing to sell risk to the wealthier, who would suffer less relative utility loss in the insured loss.
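Bernoulli’s claim can be sketched numerically: on a log scale, equal ratios of wealth produce equal differences in utility, and a certain $80 beats an 80% chance of $100 (we set the utility of the $0 outcome to 0 for this sketch, since log(0) is undefined):

```python
import math

log_utility = math.log  # Bernoulli: utility grows with the logarithm of wealth

# Equal ratios of wealth -> equal utility gaps ($1,000 vs $100 == $100 vs $10)
gap_a = log_utility(1000) - log_utility(100)
gap_b = log_utility(100) - log_utility(10)
print(round(gap_a, 6) == round(gap_b, 6))  # True: both gaps equal log(10)

# Certainty: a sure $80 vs. an 80% chance of $100 (same expected wealth)
sure_thing = log_utility(80)
gamble = 0.8 * log_utility(100)   # the 20% chance of $0 contributes u($0) = 0
print(sure_thing > gamble)        # True: the sure thing has higher expected utility
```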
Despite its strengths, this model presented problems in other cases. Here’s an extended example.
Say Anthony has $1 million and Beth has $4 million. Anthony gains $1 million and Beth loses $2 million, so they each now have $2 million. Are Anthony and Beth equally happy?
Obviously not - Beth...
PDF Summary Part 4-2: Implications of Prospect Theory
...
We Feel Better About Absolute Certainty
We’ve covered how people feel about small chances. Now consider how you feel about these options on the opposite end of probability:
- In a surgical procedure, going from 90% success rate to 95% success rate.
- In a surgical procedure, going from 95% success rate to 100% success rate.
Most likely, you felt better about the second improvement than the first. Outcomes that are almost certain are given less weight than their probability justifies. A 95% success rate is actually fantastic! But it doesn’t feel that way, because it isn’t 100%.
As a practical example, people fighting lawsuits tend to take settlements even if they have a strong case. They overweight the small chance of a loss.
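This underweighting of near-certainty can be quantified with the probability-weighting function from Tversky and Kahneman’s cumulative prospect theory, w(p) = p^γ / (p^γ + (1−p)^γ)^(1/γ), with their estimated γ ≈ 0.61 for gains (the function comes from their 1992 paper, not from this summary):

```python
def prospect_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting: overweights small p,
    underweights near-certain p (gamma=0.61 is their 1992 estimate for gains)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# A 95% chance "feels" like considerably less than 95%...
print(f"w(0.95) = {prospect_weight(0.95):.2f}")  # roughly 0.79
# ...while a 1% chance feels several times larger than it is:
print(f"w(0.01) = {prospect_weight(0.01):.2f}")  # roughly 0.06
```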
(Shortform note: how we feel about 0% and 100% are similar and are inversions of each other. A 100% gain can be converted into 0% loss—we feel strongly about both. For example, say a company has a 100% chance of failure, but a new project reduces that to 99%. It feels as though the chance of failure is reduced much more than 1%. Inversely, a project that increases the rate of success from 0% to 1% seems much more likely to work than 1%...
PDF Summary Part 4-3: Variations on a Theme of Prospect Theory
...
- Notably, a third group (Choosers) could receive either a mug or a sum of money, and they rated receiving $3.12 as equally desirable as receiving the mug. The mug owners faced an essentially identical choice: they would leave with either the mug or the money someone paid for it. But because they already possessed the mug, they demanded far more to give it up.
- Homeowners who bought at a higher price spend longer trying to sell and set a higher listing price, even though the home’s current market value is rationally all that matters.
The endowment effect doesn’t occur in all cases - people are willing to exchange $5 for five $1 bills, and furniture vendors are happy to exchange a table for money. When the asset under question is held for exchange, the endowment effect doesn’t apply.
You only feel endowed with items that are planned for consumption or use, like a bottle of wine or vacation days.
As with prospect theory and loss aversion, experienced financial traders show less attachment to the endowment effect.
Goals are Reference Points
**We are driven more to avoid failing a goal...
PDF Summary Part 4-4: Broad Framing and Global Thinking
...
- In a company, individual project leaders can be risk averse when leading their own project, since their compensation is heavily tied to project success. Yet the CEO overlooking all projects may wish that all project leaders take on the maximum appropriate risk, since this maximizes the expected value of the total portfolio.
- The opposite scenario may also happen: in a company, leaders of individual projects that are failing may be tempted to run an expensive hail-mary, to seek the small chance of a rescue (because of overweighting probabilities at the edges). In the broad framing, the CEO may prefer to shut down projects and redirect resources to the winning projects.
- A risk-averse defendant peppered with frivolous lawsuits may be tempted to settle each one individually, but in the broad framing this can be costly compared with the baseline rate at which the defendant would win in court (not to mention that settling invites more lawsuits).
- Appliance buyers may buy individual appliance insurance, when the broad framing of all historical appliances may show this is clearly unprofitable for individuals.
Antidote to Narrow Framing
To overcome narrow framing, adopt...
PDF Summary Part 5-1: The Two Selves of Happiness
...
But the remembering self evaluates differently from the experiencing self in two critical ways:
- Peak-end rule: The overall rating is determined by the peak intensity of the experience and by how it ends. Everything in between counts for little.
- Duration neglect: The duration of the experience has little effect on the memory of the event.
Both effects operate in classic System 1 style: by averages and norms, not by sums.
This leads to preferences that the experiencing self would find odd, and show that we cannot trust our preferences to reflect our interests.
In the ice water experiment, participants were asked to stick their hand in cold water, then to evaluate their experience. Participants stuck their hand in cold water in two episodes: 1) a short episode: 60 seconds in 14°C water, and 2) a long episode: 60 seconds in 14°C, plus an additional 30 seconds, during which the temperature increased to 15°C. They were then asked which they would repeat for a third trial.
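The two selves’ evaluation rules can be sketched with hypothetical discomfort scores (the numbers below are illustrative, not from the study):

```python
# Per-10-second discomfort scores (0-10, hypothetical): colder water hurts more.
short_episode = [8, 8, 8, 8, 8, 8]             # 60s at 14°C
long_episode  = [8, 8, 8, 8, 8, 8, 7, 6, 5]    # same 60s, plus 30s warming to 15°C

def experiencing_self(episode):
    """Total suffering: the sum over the whole episode (duration matters)."""
    return sum(episode)

def remembering_self(episode):
    """Peak-end rule: the average of the worst moment and the final moment."""
    return (max(episode) + episode[-1]) / 2

print(experiencing_self(long_episode) > experiencing_self(short_episode))  # True: more total pain
print(remembering_self(long_episode) < remembering_self(short_episode))    # True: milder memory
```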
The experiencing self would clearly consider the long episode worse—you’re suffering for more time. But the longer episode had a more...
PDF Summary Part 5-2: Experienced Well-Being vs Life Evaluations
...
* Things that affect mood: coworker relations, loud noise, time pressure, a boss hovering around you.
* Things that do not affect mood: benefits, status, pay.
- Some activities generally seen as positive (like having a romantic partner) don’t improve experienced well-being. This might be partially because of tradeoffs—women in relationships spend less time alone, but they also have less time with friends. They spend more time having sex, but they also spend more time doing housework and caring for children.
Suggestions for Improving Experienced Well-being
How can you improve your moment-to-moment happiness?
- Focus your time on what you enjoy. Commute less.
- To get pleasure from an activity, you must notice that you’re doing it. Avoid passive leisure like watching TV, and spend more time on active leisure, like socializing and exercising.
Reducing the U-index should be seen as a worthwhile societal goal. Reducing the U-index by 1% across society would be a huge achievement, with millions of hours of avoided suffering.
Experienced Well-Being vs Life Evaluations
Where well-being is measured by methods like the Day Reconstruction...
PDF Summary Shortform Exclusive: Checklist of Antidotes
...
* If you’re given one extreme number to adjust from, repeat your reasoning with an extreme value from the other direction. Adjust from there, then average your final two results.
- Availability bias
- Force yourself to lower the weight on more available things. Especially be wary of things you see repeated often in news or advertising, and of things that strongly trigger extreme emotions.
- Be aware of “common sense” intuitions that seem true merely because they’re repeated often, like “searing meat seals in the juices.”
- Representativeness
- To avoid the “Tom W. librarian” problem, use Bayesian statistics. Start by predicting the base rates, using whatever factual data you have. Then consider how the new data should influence the base rates.
- Conjunction fallacy
- When hearing a complicated explanation that has too many convenient assumptions or vivid details (like an investment pitch for a business or an explanation for what caused a phenomenon), be aware that each additional assumption lowers the likelihood of it being true.
- Hindsight/Outcome Bias
- Keep a journal of your current beliefs and what you estimate the outcomes to be. In...