This is a preview of the Shortform book summary of Thinking, Fast and Slow by Daniel Kahneman.

1-Page Book Summary of Thinking, Fast and Slow

We like to think that we’re intelligent, rational beings who generally make good decisions. However, psychologist Daniel Kahneman says that isn’t the case; in reality, the human mind is shockingly hasty, imprecise, and lazy. In Thinking, Fast and Slow (2011), Kahneman explains how we make decisions, why those decisions are often wrong—or at least suboptimal—and how we can work around our natural shortcomings to make better decisions in the future.

Kahneman (1934-2024) was a psychologist who specialized in the study of human decision-making. His groundbreaking research on cognitive biases, heuristics, and behavioral economics earned him the 2002 Nobel Memorial Prize in Economic Sciences, making him one of the few non-economists ever to receive this honor. Kahneman is best known for co-developing prospect theory with fellow psychologist Amos Tversky.

We’ll begin this guide by describing the two systems of thought that Kahneman identified within the human mind—the titular “fast” and “slow” methods of thinking. Next, we’ll go over a number of ways that our thought processes can be sloppy and biased, and why that happens so regularly. We’ll then discuss prospect theory, which argues that people make decisions based largely on emotion rather than reason. Finally, we’ll conclude with a brief examination of Kahneman’s research into happiness, and how a better understanding of ourselves can support our overall well-being.

In our commentary, we’ll explore some evolutionary origins of cognitive biases, look at how the biases Kahneman highlights relate to additional biases, and compare Kahneman’s insights to those from other psychologists, such as Malcolm Gladwell and Barbara Oakley.

Two Systems of Thought

Kahneman defines two systems of the mind—the two different ways you think and make decisions. In this section, we’ll start by describing the quick, largely subconscious System 1, then move on to the slower and more rational System 2. We’ll also discuss why Kahneman believes that each system has its own fundamental flaw.

System 1: Thinking Fast

Kahneman explains that System 1 operates automatically and quickly, with little or no conscious effort and no sense of voluntary control. This system gives rise to your feelings and intuitions, and it suggests courses of action for your conscious mind to consider. In short, System 1 is Kahneman’s collective term for all the ways you think fast.

Some examples of System 1 thinking include your ability to tell that one object is farther away than another, to read and understand simple sentences, to detect the emotions in someone else’s voice, and to perform simple or habitual actions like driving a car on an empty road.

(Shortform note: System 1 thinking gives rise to what Malcolm Gladwell (Blink) calls unconscious thinking, better known as intuition or a “gut feeling.” Gladwell believes unconscious thinking is often as effective as conscious, rational thinking, with the added benefits of happening more quickly and being less likely to get derailed by stress or anxiety. He explains that your subconscious mind naturally takes in a huge amount of information—much more than you consciously process—filters out what’s irrelevant, and returns the best answer to your current problem.)

Kahneman further explains that System 1 operates through association: It rapidly connects what you see or experience to related concepts and patterns stored in your memory. By doing so, this mental system can generate impressions and judgments nearly instantly.

For instance, if you see someone frowning, System 1 thinking instantly associates that input with related concepts like anger, threat, and negativity. As a result, within milliseconds of seeing that expression, you’re primed to defend yourself, escape, or defuse the situation.

(Shortform note: Associative thinking like Kahneman describes here isn’t just fast, it’s also creative—in fact, in The Innovator’s DNA, Hal Gregersen, Jeff Dyer, and Clayton Christensen argue that associative thinking is the very foundation of creativity. They explain that creative ideas and innovations usually aren’t completely new concepts, but rather come from people connecting concepts in ways that nobody had connected them before.)

System 1 Is Inaccurate

Since you can’t stop to consciously think about everything you see, hear, or otherwise experience, you need System 1 to function in your everyday interactions. However, Kahneman adds that this system is impulsive and imprecise.

To continue the previous example, you might see someone frowning and, because of System 1 thinking, immediately assume the person is angry and that you’re in danger. However, they could be frowning for any number of reasons: Perhaps they’re simply thinking over a difficult problem, and the expression has nothing to do with you at all.

Trick questions provide perfect demonstrations of System 1’s fallibility. For instance: According to the Old Testament, how many of each type of animal did Moses take on the ark?

You most likely thought of the number two immediately. However, if you think carefully, you’ll realize the answer is actually zero—it was Noah who took animals on the ark, not Moses. Your associative System 1 thinking saw the words “animal” and “ark” and answered the question it thought was being asked, rather than the actual question.

System 1 Reliance Is an Evolutionary Holdover

If System 1 thinking is so unreliable, why do we depend on it so much? In large part, the answer lies in our evolutionary...


Here's a preview of the rest of Shortform's Thinking, Fast and Slow summary:

Thinking, Fast and Slow Summary Part 1-1: Two Systems of Thinking

We believe we’re being rational most of the time, but really much of our thinking is automatic, done subconsciously by instinct. Most impressions arise without your knowing how they got there. Can you pinpoint exactly how you knew a man was angry from his facial expression, or how you could tell that one object was farther away than another, or why you laughed at a funny joke?

This becomes more practically important for the decisions we make. Often, we’ve decided what we’re going to do before we even realize it. Only after this subconscious decision does our rational mind try to justify it.

The brain does this to save effort, substituting easier questions for harder ones. Instead of asking, “Should I invest in Tesla stock? Is it priced correctly?” you might instead ask, “Do I like Tesla cars?” The insidious part is that you often don’t notice the substitution. This type of substitution produces systematic errors, also called biases. We are blind to our blindness.

System 1 and System 2 Thinking

In Thinking, Fast and Slow, Kahneman defines two systems of the mind:

System 1: operates automatically and quickly, with little or no effort, and no...

Try Shortform for free

Read full summary of Thinking, Fast and Slow

Sign up for free

Thinking, Fast and Slow Summary Part 1-2: System 2 Has a Maximum Capacity

System 2 thinking has a limited budget of attention - you can only do so many cognitively difficult things at once.

This limitation is true when doing two tasks at the same time - if you’re navigating traffic on a busy highway, it becomes far harder to solve a multiplication problem.

This limitation is also true when one task comes after another - depleting System 2 resources earlier in the day can lower inhibitions later. For example, a hard day at work will make you more susceptible to impulsive buying from late-night infomercials. This is also known as “ego depletion,” or the idea that you have a limited pool of willpower or mental resources that can be depleted each day.

All forms of voluntary effort - cognitive, emotional, physical - seem to draw at least partly on a shared pool of mental energy.

  • Stifling emotions during a sad film worsens physical stamina later.
  • Memorizing a list of seven digits makes subjects more likely to give in to decadent desserts.

Differences in Demanding Tasks

The law of least effort states that “if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of...

What Our Readers Say

This is the best summary of How to Win Friends and Influence People I've ever read. The way you explained the ideas and connected them to other books was amazing.
Learn more about our summaries →

Thinking, Fast and Slow Summary Part 1-3: System 1 is Associative

Think of your brain as a vast network of ideas connected to each other. These ideas can be concrete or abstract. The ideas can involve memories, emotions, and physical sensations.

When one node in the network is activated, say by seeing a word or image, it automatically activates its surrounding nodes, rippling outward like a pebble thrown in water.

As an example, consider the following two words:

“Bananas Vomit”

Suddenly, within a second, reading those two words may have triggered a host of different ideas. You might have pictured yellow fruits; felt a physiological aversion in the pit of your stomach; remembered the last time you vomited; thought about sickness - all done automatically without your conscious control.

The evocations can be self-reinforcing - a word evokes memories, which evoke emotions, which evoke facial expressions, which evoke other reactions, and which reinforce other ideas.

Links between ideas take several forms:

  • Cause → Effect
  • Category membership (lemon → fruit)
  • Thing → Property (lemon → yellow, sour)

Association is Fast and Subconscious

In the next exercise, you’ll be shown three words....


Thinking, Fast and Slow Summary Part 1-4: How We Make Judgments

System 1 continuously monitors what’s going on outside and inside the mind and generates assessments with little effort and without intention. The basic assessments include language, facial recognition, social hierarchy, similarity, causality, associations, and exemplars.

  • In this way, you can look at a male face and consider him competent (for instance, if he has a strong chin and a slight confident smile).
  • The survival purpose is to monitor surroundings for threats.

However, not every attribute of the situation is measured. System 1 is much better at comparing things and judging the average of a set than at computing the sum of a set. Here’s an example:

In the below picture, try to quickly determine what the average length of the lines is.

[Image: a set of lines of varying lengths]

Now try to determine the sum of the length of the lines. This is less intuitive and requires System 2.

Unlike System 2 thinking, these basic assessments of System 1 are not impaired when the observer is cognitively busy.

In addition to basic assessments: System 1 also has two other...

Why people love using Shortform

"I LOVE Shortform as these are the BEST summaries I’ve ever seen...and I’ve looked at lots of similar sites. The 1-page summary and then the longer, complete version are so useful. I read Shortform nearly every day."
Jerry McPhee
Sign up for free

Thinking, Fast and Slow Summary Part 1-5: Biases of System 1

Putting it all together, we are most vulnerable to biases when:

  • System 1 forms a narrative that conveniently connects the dots and registers no surprise.
  • Because of the cognitive ease produced by System 1, System 2 is not invoked to question the data. It merely accepts the conclusions of System 1.

In day-to-day life, this is acceptable if the conclusions are likely to be correct, the costs of a mistake are acceptable, and if the jump saves time and effort. You don’t question whether to brush your teeth each day, for example.

In contrast, this shortcut in thinking is risky when the stakes are high and there’s no time to collect more information - like when serving on a jury, deciding which job applicant to hire, or deciding how to behave in a weather emergency.

We’ll end part 1 with a collection of biases.

What You See is All There Is: WYSIATI

When presented with evidence - especially evidence that confirms your mental model - you do not question what evidence might be missing. System 1 seeks to build the most coherent story it can; it does not stop to examine the quality and quantity of the information.

In an experiment, three groups were given background to a legal case....


Thinking, Fast and Slow Summary Part 2: Heuristics and Biases | 1: Statistical Mistakes

Kahneman transitions to Part 2 from Part 1 by explaining more heuristics and biases we’re subject to.

The general theme of these biases: we prefer certainty over doubt. We prefer coherent stories of the world, clear causes and effects. Sustaining incompatible viewpoints at once is harder work than sliding into certainty. A message, if it is not immediately rejected as a lie, will affect our thinking, regardless of how unreliable the message is.

Furthermore, we pay more attention to the content of a story than to the reliability of the data behind it. We prefer simple, coherent views of the world and overlook the ways those views are unjustified. We overestimate causal explanations and ignore base statistical rates. As a result, our intuitive predictions are often too extreme, and we put too much faith in them.

This chapter will focus on statistical mistakes - when our biases make us misinterpret statistical truths.

The Law of Small Numbers

The smaller your sample size, the more likely you are to have extreme results. When you have small sample sizes, do NOT be misled by outliers.

A facetious example: in a series of 2 coin tosses, you are likely to get 100% heads....
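The coin-toss point can be made concrete with a little arithmetic (a sketch we've added; the specific probabilities aren't stated in the summary). An "extreme" result like 100% heads is common in tiny samples and vanishingly rare in large ones:

```python
# Probability that a fair coin comes up heads on EVERY toss in a sample.
def p_all_heads(n_tosses: int) -> float:
    return 0.5 ** n_tosses

print(p_all_heads(2))    # 0.25 -- a quarter of 2-toss samples are "100% heads"
print(p_all_heads(10))   # ~0.001
print(p_all_heads(100))  # astronomically small
```

The same logic applies to any small sample: outliers are a predictable artifact of sample size, not evidence of an underlying pattern.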


Thinking, Fast and Slow Summary Part 2-2: Anchors

Anchoring describes the bias where you depend too heavily on an initial piece of information when making decisions.

In quantitative terms, when you are exposed to a number, then asked to estimate an unknown quantity, the initial number affects your estimate of the unknown quantity. Surprisingly, this happens even when the number has no meaningful relevance to the quantity to be estimated.

Examples of anchoring:

  • Students are split into two groups. One group is asked if Gandhi died before or after age 144. The other group is asked if Gandhi died before or after age 32. Both groups are then asked to estimate what age Gandhi actually died at. The first group, who were asked about age 144, estimated a higher age of death than students who were asked about age 32, with a difference in average guesses of over 15 years.
  • Students were shown a wheel of fortune game that had numbers on it. The game was rigged to show only the numbers 10 or 65. The students were then asked to estimate the percentage of African nations in the UN. The average estimates came to 25% and 45%, depending on whether they were shown 10 or 65, respectively.
  • A nonprofit requested different amounts of...


Thinking, Fast and Slow Summary Part 2-3: Availability Bias

When trying to answer the question “what do I think about X?,” you actually tend to think about the easier but misleading questions, “what do I remember about X, and how easily do I remember it?” The more easily you remember something, the more significant you perceive what you’re remembering to be. In contrast, things that are hard to remember are lowered in significance.

More quantitatively, when trying to estimate the size of a category or the frequency of an event, you instead use the heuristic: how easily do the instances come to mind? Whatever comes to your mind more easily is weighted as more important or true. This is the availability bias.

This means a few things:

  • Items that are easier to recall take on greater weight than they should.
  • When estimating the size of a category, like “dangerous animals,” if it’s easy to retrieve items for a category, you’ll judge the category to be large.
  • When estimating the frequency of an event, if it’s easy to think of examples, you’ll perceive the event to be more frequent.

In practice, this manifests in a number of ways:

  • Events that trigger stronger emotions (like terrorist attacks) are more readily...


Thinking, Fast and Slow Summary Part 2-4: Representativeness

Read the following description of a person.

Tom W. is meek and keeps to himself. He likes soft music and wears glasses. Which profession is Tom W. more likely to be? 1) Librarian. 2) Construction worker.

If you picked librarian without thinking too hard, you used the representativeness heuristic - you matched the description to the stereotype, while ignoring the base rates.

Ideally, you should have examined the base rate of both professions in the male population, then adjusted based on his description. Construction workers outnumber librarians by 10:1 in the US - there are likely more shy construction workers than all librarians!
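Bayes' rule makes the base-rate argument explicit. The 10:1 ratio comes from the text, but the "fit rates" below are hypothetical numbers we've chosen for illustration: even if a librarian is far more likely to fit Tom's description, the sheer number of construction workers can dominate.

```python
# Hypothetical illustration of base rates vs. representativeness (Bayes' rule).
base_librarian = 1 / 11        # 10:1 ratio of construction workers to librarians
base_construction = 10 / 11
fit_if_librarian = 0.40        # hypothetical: 40% of librarians fit the description
fit_if_construction = 0.05     # hypothetical: 5% of construction workers do

# P(librarian | fits description) via Bayes' rule
evidence = (fit_if_librarian * base_librarian
            + fit_if_construction * base_construction)
p_librarian = fit_if_librarian * base_librarian / evidence
print(round(p_librarian, 2))  # 0.44 -- still less than a coin flip
```

Even with a description that fits librarians eight times better, the base rate keeps "librarian" below 50%.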

More generally, the representativeness heuristic describes when we estimate the likelihood of an event by comparing it to an existing prototype in our minds - matching like to like. But just because something is plausible does not make it more probable.

The representativeness heuristic is strong in our minds and hard to overcome. In experiments, even when people receive data about base rates (like about the proportion of construction workers to librarians), people tend to ignore this information, trusting their stereotype...


Thinking, Fast and Slow Summary Part 2-5: Overcoming the Heuristics

As we’ve been discussing, the general solution to overcoming statistical heuristics is by estimating the base probability, then making adjustments based on new data. Let’s work through an example.

Julie is currently a senior in a state university. She read fluently when she was four years old. What is her grade point average (GPA)?

People often compute this using intensity matching and representativeness, like so:

  • Reading fluently at 4 puts her at, say, the 90th percentile of all kids.
  • The 90th percentile GPA is somewhere around a 3.9.
  • Thus Julie likely has a 3.9 GPA.

Notice how misguided this line of thinking is! People are predicting someone’s academic performance 2 decades later based on how they behaved at 4. System 1 pieces together a coherent story about a smart kid becoming a smart adult.

The proper way to answer questions like these is as follows:

  • Start by estimating the average GPA - this is the base data if you had no information about the student whatsoever. Say this is 3.0.
  • Determine the GPA that matches your impression of the...
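The adjustment step can be sketched as regression toward the mean: shrink the intensity-matched guess back toward the baseline in proportion to how strongly the evidence actually predicts the outcome. The correlation value below is a hypothetical number we've chosen for illustration.

```python
# Regression-toward-the-mean prediction (sketch; correlation is a made-up value).
base_gpa = 3.0        # average GPA, your estimate with no information about Julie
matched_gpa = 3.9     # GPA naively matched to "read fluently at age 4"
correlation = 0.3     # hypothetical correlation between early reading and college GPA

prediction = base_gpa + correlation * (matched_gpa - base_gpa)
print(round(prediction, 2))  # 3.27 -- much closer to the average than the naive 3.9
```

The weaker the correlation between the evidence and the outcome, the more the prediction should collapse toward the base rate.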


Thinking, Fast and Slow Summary Part 3: Overconfidence | 1: Flaws In Our Understanding

Part 3 explores biases that lead to overconfidence. With all the heuristics and biases described above working against us, when we construct satisfying stories about the world, we vastly overestimate how much we understand about the past, present, and future.

The general principle of the biases has been this: we desire a coherent story of the world. This comforts us in a world that may be largely random. If it’s a good story, you believe it.

Insidiously, the fewer data points you receive, the more coherent the story you can form. You often don’t notice how little information you actually have and don’t wonder about what is missing. You focus on the data you have, and you don’t imagine all the events that failed to happen (the nonevents). You ignore your ignorance.

And even if you’re aware of the biases, you are nowhere near immune to them. Even when told that these biases exist, you often exempt yourself, assuming you’re smart enough to avoid them.

The ultimate test of an explanation is whether it can predict future events accurately. This is the guideline by which you should assess the merits of your beliefs.

Narrative Fallacy

We desire packaging up a...


Thinking, Fast and Slow Summary Part 3-2: Formulas Beat Intuitions

Humans have to make decisions from complicated datasets frequently. Doctors make diagnoses, social workers decide if foster parents are good, bank lenders measure business risk, and employers have to hire employees.

Unfortunately, humans are also surprisingly bad at making the right prediction. Across studies, algorithms have beaten or matched humans in making accurate predictions. And even when algorithms merely match human performance, they still win because they are so much cheaper.

Why are humans so bad? Simply put, humans overcomplicate things.

  • They inappropriately weigh factors that are not predictive of performance (like whether they like the person in an interview).
  • They try too hard to be clever, considering complex combinations of features when a simple weighting of features is sufficient.
  • Their judgment varies from moment to moment without them realizing it. System 1 is very susceptible to influences the conscious mind never registers: the person’s environment, current mood, state of hunger, and recent exposure to information can all sway decisions. Algorithms don’t feel hunger.
    • As an example, radiologists who read the same...


Thinking, Fast and Slow Summary Part 3-3: The Objective View

We are often better at analyzing external situations (the “outside view”) than our own. When you look inward at yourself (the “inside view”), it’s too tempting to consider yourself exceptional— “the average rules and statistics don’t apply to me!” And even when you do get statistics, it’s easy to discard them, especially when they conflict with your personal impressions of the truth.

In general, when you have information about an individual case, it’s tempting to believe the case is exceptional, and to disregard statistics of the class to which the case belongs.

Here are examples of situations where people ignore base statistics and hope for the exceptional:

  • 90% of drivers state they’re above-average drivers. They don’t necessarily think about what “average” means statistically - instead, they judge whether the skill feels easy for them, then intensity match to place themselves in the population.
  • Most people believe they are superior to most others on most desirable traits.
  • When getting consultations, lawyers may refuse to comment on the projected outcome of a case, saying “every case is unique.”
  • Business owners know that only 35% of new businesses...


Thinking, Fast and Slow Summary Part 4: Choices | 1: Prospect Theory

Part 4 of Thinking, Fast and Slow turns from cognitive biases to Kahneman’s other major work, prospect theory. This covers risk aversion and risk seeking, our inaccurate weighting of probabilities, and the sunk cost fallacy.

Prior Work on Utility

How do people make decisions in the face of uncertainty? There’s a rich history, spanning centuries, of scientists and economists studying this question. Each major development in decision theory eventually revealed exceptions that exposed its weaknesses, which in turn led to newer, more nuanced theories.

Expected Utility Theory

Traditional “expected utility theory” asserts that people are rational agents who calculate the utility of each option and make the optimal choice each time.

If you preferred apples to bananas, would you rather have a 10% chance of winning an apple, or 10% chance of winning a banana? Clearly you’d prefer the former.

Similarly, when taking bets, this model assumes that people calculate the expected value and choose the best option.

This is a simple, elegant theory that by and large works and is still taught in intro economics. But it failed to explain the phenomenon of risk aversion, where in...


Thinking, Fast and Slow Summary Part 4-2: Implications of Prospect Theory

With the foundation of prospect theory in place, we’ll explore a few implications of the model.

Probabilities are Overweighted at the Edges

Consider which is more meaningful to you:

  • Going from 0% chance of winning $1 million to 5% chance
  • Going from 5% chance of winning $1 million to 10% chance

Most likely you felt better about the first than the second. The mere possibility of winning something (however unlikely) is overweighted in importance. (Shortform note: as Jim Carrey’s character said in the film Dumb and Dumber, in response to a woman who gave him a one-in-a-million shot at being with her: “So you’re telling me there’s a chance!”)
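This overweighting near the edges is captured quantitatively by the probability-weighting function from Kahneman and Tversky's later (1992) formulation of prospect theory; the functional form and the γ value below come from that literature, not from this preview.

```python
# Tversky-Kahneman (1992) probability weighting function:
# w(p) = p^g / (p^g + (1-p)^g)^(1/g), with g ≈ 0.61 estimated for gains.
def w(p: float, g: float = 0.61) -> float:
    return p**g / (p**g + (1 - p)**g) ** (1 / g)

possibility_jump = w(0.05) - w(0.0)  # the "possibility effect" at the low edge
middling_jump = w(0.10) - w(0.05)
print(round(possibility_jump, 3))  # ~0.13
print(round(middling_jump, 3))     # ~0.05
# Going from impossible to 5% feels bigger than going from 5% to 10%.
```

The same function also compresses differences in the middle of the range, which is why a move from 45% to 50% barely registers.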

More examples of this effect:

We fantasize about small chances of big gains.

  • Lottery tickets and gambling in general play on this hope.
  • A small sliver of chance to rescue a failing company is given outsized weight.

We obsess about tiny chances of very bad outcomes.

  • The risk of nuclear disasters and natural disasters is overweighted.
  • We worry about our child coming home late at night, though rationally we know there’s little...


Thinking, Fast and Slow Summary Part 4-3: Variations on a Theme of Prospect Theory

Indifference Curves and the Endowment Effect

Basic theory suggests that people have indifference curves when relating two dimensions, like salary and number of vacation days. Say that you value one day’s salary at about the same as one vacation day.

Theoretically, you should be willing to trade for any other point on the indifference curve at any time. So when, at the end of the year, your boss says you’re getting a raise and gives you the choice of 5 extra days of vacation or a salary raise equivalent to 5 days of pay, you see them as roughly equivalent.

But say you get presented with another scenario. Your boss presents a new compensation package, saying that you can get 5 extra days of vacation per year, but then have to take a cut of salary equivalent to 5 days of pay. How would you feel about this?

Likely, loss aversion kicked in. Even though you were theoretically on your indifference curve - exchanging 5 days of pay for 5 vacation days - you didn’t experience it as a neutral exchange.

As with prospect theory, the idea of indifference curves ignores the reference point at which you start. In general, people have inertia to change.
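Loss aversion can be sketched with the prospect-theory value function; the functional form and parameter values below are from Tversky and Kahneman's 1992 work, not from this preview. Losses loom roughly twice as large as equivalent gains, so trading 5 days' pay for 5 vacation days no longer feels like an even swap.

```python
# Prospect-theory value function (sketch; parameters from Tversky & Kahneman, 1992).
ALPHA = 0.88   # diminishing sensitivity for both gains and losses
LAMBDA = 2.25  # loss-aversion coefficient: losses hurt ~2.25x as much

def value(x: float) -> float:
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

gain = value(5)    # gaining 5 vacation days
loss = value(-5)   # giving up 5 days of pay
print(gain + loss < 0)  # True: the package feels like a net loss
```

The objective trade is even, but because the loss side is multiplied by λ, the felt value of the package is negative, which is exactly the inertia around the reference point described above.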

They call...


Thinking, Fast and Slow Summary Part 4-4: Broad Framing and Global Thinking

When you evaluate a decision, you’re prone to focus on the individual decision, rather than the big picture of all decisions of that type. A decision that might make sense in isolation can become very costly when repeated many times.

Consider both decision pairs, then decide what you would choose in each:

Pair 1

1) A certain gain of $240.

2) 25% chance of gaining $1000 and 75% chance of nothing.

Pair 2

3) A certain loss of $750.

4) 75% chance of losing $1000 and 25% chance of losing nothing.

As we know already, you likely gravitated to Option 1 and Option 4.

But let’s combine the options across the two pairs and weigh the combinations against each other.

1+4: 75% chance of losing $760 and 25% chance of gaining $240

2+3: 75% chance of losing $750 and 25% chance of gaining $250

Even without calculating expected values, 2+3 is clearly superior to 1+4: you have the same chance of losing less money and the same chance of gaining more money. Yet you probably didn’t think to combine the options across pairs and compare the combinations!
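The comparison can also be verified by computing expected values for the two combined gambles; the arithmetic below just restates the numbers from the text.

```python
# Expected value of each combined gamble.
# 1+4: certain +$240 combined with (75% lose $1000, 25% lose nothing)
ev_1_plus_4 = 0.75 * (240 - 1000) + 0.25 * 240     # 75% of -$760, 25% of +$240
# 2+3: (25% win $1000, 75% win nothing) combined with a certain -$750
ev_2_plus_3 = 0.25 * (1000 - 750) + 0.75 * (-750)  # 25% of +$250, 75% of -$750
print(ev_1_plus_4)  # -510.0
print(ev_2_plus_3)  # -500.0 -- better, yet it's the combination people avoid
```

The intuitively attractive picks (a sure gain, a gamble to avoid a sure loss) combine into the strictly worse package.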

This is the difference between narrow framing and broad framing. The ideal broad framing is to consider every combination of options to find the...


Thinking, Fast and Slow Summary Part 5-1: The Two Selves of Happiness

Part 5 of Thinking, Fast and Slow turns from cognitive biases and mistakes to the nature of happiness.

(Shortform note: compared to the previous sections, the concepts in this final portion reflect Kahneman’s more recent research interests and are more of a work in progress. Therefore, they tend to have less experimental backing and less settled conclusions.)

Happiness is a tricky concept. There is in-the-moment happiness, and there is overall well-being. There is happiness we experience, and happiness we remember.

Consider having to get a number of painful shots a day. There is no habituation, so each shot is as painful as the last. Which one represents a more meaningful change?

  • Decreasing from 20 shots to 18 shots
  • Decreasing from 6 shots to 4 shots

You likely thought the latter was far more meaningful, especially since it moves closer to zero pain. But Kahneman found this preference incomprehensible: Two shots is two shots. The same quantum of pain is removed in each case, so the two changes should be valued much more similarly.

In Kahneman’s view, someone who pays different amounts for the same gain of experienced utility is making a...


Thinking, Fast and Slow Summary Part 5-2: Experienced Well-Being vs Life Evaluations

Measuring Experienced Well-Being

How do you measure well-being? The traditional survey question reads: “All things considered, how satisfied are you with your life as a whole these days?”

Kahneman was suspicious that the remembering self would dominate the answer, and that people are terrible at “considering all things.” The question tends to trigger whatever one thing is currently giving immense pleasure (like dating a new person) or pain (like an argument with a co-worker).

To measure experienced well-being, he led a team to develop the Day Reconstruction Method, which prompts people to relive the day in detailed episodes, then to rate the feelings. Following the philosophy of happiness being the “area under the curve,” they conceived of the metric U-index: the percentage of time an individual spends in an unpleasant state.
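The U-index itself is simple to compute: the fraction of time spent in episodes whose dominant feeling is unpleasant. The episode data below are invented for illustration.

```python
# U-index sketch: percentage of time spent in an unpleasant state.
# Each episode: (duration in minutes, was the dominant feeling unpleasant?)
episodes = [
    (90, False),   # morning coffee and reading
    (45, True),    # stressful commute
    (480, False),  # workday, on balance neutral-to-pleasant
    (60, True),    # tense meeting
    (180, False),  # evening with family
]

total = sum(minutes for minutes, _ in episodes)
unpleasant = sum(minutes for minutes, bad in episodes if bad)
u_index = unpleasant / total
print(f"U-index: {u_index:.0%}")  # 12% of this day spent in an unpleasant state
```

Because it weights by duration rather than by memory, the U-index measures the experiencing self, not the remembering self.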

They reported these findings:

  • There was large inequality in the distribution of pain. 50% of people reported going through a day without a single unpleasant episode. But a minority experienced considerable emotional distress for much of the day - for instance, from illness, misfortune, or personal disposition.
  • Different activities have different...


Thinking, Fast and Slow Summary Shortform Exclusive: Checklist of Antidotes

As an easy reference, here’s a checklist of antidotes covering every major bias and heuristic from the book.

Cognitive Biases and Heuristics

  • To block System 1 errors, recognize the signs that you’re in trouble and ask System 2 for reinforcement.
  • Observing errors is easier in others than in yourself, so ask others for review. In this way, organizations can be better at decision-making than individuals.
  • To better regulate your behavior, make critical choices in times of low duress so that System 2 is not taxed.
    • Order food in the morning, not when you’re tired after work or struggling to meet a deadline.
    • Notice when you’re likely to be in times of high duress, and put off big decisions to later. Don’t make big decisions when nervous about others watching.
  • In general, when estimating probability, begin with the baseline probability. Then adjust from this rate based on new data. Do NOT start with your independent guess of probability, since you ignore the data you don’t have.
  • WYSIATI
    • Force yourself to ask: “what evidence am I missing? What evidence would make me change my mind?”
  • Ordering effect
    • Before having a public...
