System 1 and System 2 Thinking: Use Both to Make the Best Decisions

This article is an excerpt from the Shortform summary of "Thinking, Fast and Slow" by Daniel Kahneman. Shortform has the world's best summaries of books you should be reading.

Like this article? Sign up for a free trial here.

What are System 1 and System 2 thinking? How do they work in concert, and when should you rely on one or the other?

System 1 and System 2 thinking are two systems of thinking defined by Daniel Kahneman in Thinking, Fast and Slow. Generally, System 1 thinking is fast and System 2 thinking is slow.

We’ll cover how System 1 and System 2 thinking work together and when you should use one or the other.

System 1 and System 2 Thinking

Daniel Kahneman defines two systems of the mind: System 1 and System 2 thinking.

System 1 Thinking: operates automatically and quickly, with little or no effort, and no sense of voluntary control

  • Examples: Detect that one object is farther than another; detect sadness in a voice; read words on billboards; understand simple sentences; drive a car on an empty road.

System 2 Thinking: allocates attention to the effortful mental activities that demand it, including complex computations. Often associated with the subjective experience of agency, choice and concentration

  • Examples: Focus attention on a particular person in a crowd; exert yourself at a faster pace than is normal for you; monitor your behavior in a social situation; park in a narrow space; multiply 17 x 24.

System 1 and System 2 Thinking: How They Work Together

System 1 automatically generates suggestions, feelings, and intuitions for System 2 thinking. If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions. 

System 1 can be completely involuntary. You can’t stop your brain from completing 2 + 2 = ?, or from finding cheesecake delicious. You can’t unsee optical illusions, even if you rationally know what’s going on.

A lazy System 2 accepts what the faulty System 1 gives it without questioning. This leads to cognitive biases. Even worse, cognitive strain taxes System 2 thinking, making it more willing to accept System 1’s suggestions. Therefore, we’re more vulnerable to cognitive biases when we’re stressed.

Because Daniel Kahneman’s System 1 operates automatically and can’t be turned off, biases are difficult to prevent. Yet it’s also not wise (or practically possible) to constantly question System 1, and System 2 thinking is too slow to substitute in routine decisions. We should aim for a compromise: recognize situations when we’re vulnerable to mistakes, and avoid large mistakes when the stakes are high.

Properties of Daniel Kahneman’s System 1 and System 2 Thinking

System 1 can arise from expert intuition, trained over many hours of learning. In this way a chess master can recognize a strong move within a second, where it would take a novice several minutes of System 2 thinking.

System 2 thinking requires attention and is disrupted when attention is drawn away. More on this next.

System 1 can detect errors and recruit System 2 for additional firepower. 

  • Kahneman tells a story of a veteran firefighter who entered a burning house with his crew, felt something was wrong, and called for them to get out. The house collapsed shortly after. He only later realized that his ears were unusually hot but the fire was unusually quiet, indicating the fire was in the basement.

In summary, most of what you consciously think and do originates in System 1, but System 2 takes over when the situation gets difficult. System 1 normally has the last word.

System 1 and System 2 Thinking: Errors in One and Laziness in the Other

Consider these questions, and go through them quickly, trusting your intuition. 

1) A bat and ball cost $1.10 in total. The bat costs one dollar more than the ball. How much does the ball cost?

2) How many murders happen in Michigan each year?

3) Does the conclusion follow from the premises?

  • All roses are flowers.
  • Some flowers fade quickly.
  • Therefore, some roses fade quickly.

Ready to see the answers?

1) The answer is $0.05. The common intuitive (and wrong) answer is $0.10.

2) The trick is whether you remember that Detroit is in Michigan. People who remember this estimate a number that is much higher (and more accurate) than those who forget.

3) The answer is no: the roses in question may not be among the subcategory of flowers that fade quickly. 
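If it helps to see why, the invalid inference can be checked with a tiny counterexample; the flower names below are made up purely for illustration:

```python
# A hypothetical world of three flowers where both premises hold
# but the conclusion fails.
flowers = {"rose", "tulip", "poppy"}
roses = {"rose"}             # premise 1: all roses are flowers
fades_quickly = {"poppy"}    # premise 2: some flowers fade quickly

assert roses <= flowers                # premise 1 holds (subset)
assert fades_quickly & flowers         # premise 2 holds (non-empty overlap)
print(roses & fades_quickly)           # set() -- no rose fades quickly
```

Because one consistent world exists where the premises are true and the conclusion is false, the conclusion does not follow.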

===

All of your answers, if you really spent time on it, could be verified by deliberate System 2 thinking. In the first question, it’s easy to see that if the ball cost $0.10, the total would be $1.20, which is clearly incompatible with the question. For the second question, if you had to enumerate the major cities of Michigan, you would likely list Detroit. 
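The bat-and-ball check is simple algebra: if the ball costs x, the bat costs x + 1.00, so x + (x + 1.00) = 1.10 and x = 0.05. A minimal sketch of that deliberate System 2 calculation (the function name is just for illustration):

```python
def solve_bat_and_ball(total=1.10, difference=1.00):
    """Solve ball + bat = total, where bat = ball + difference."""
    ball = (total - difference) / 2
    bat = ball + difference
    return ball, bat

ball, bat = solve_bat_and_ball()
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
# The intuitive answer fails the same check: a $0.10 ball plus a
# $1.10 bat totals $1.20, not $1.10.
```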

For some people, spending enough time would be sufficient to get the answers right. But many people, even if given unlimited time, might not even think to apply their System 2 thinking to question their answers and find different approaches to the question. Over 50% of students at Harvard and MIT gave the wrong answer to the bat-and-ball question; over 80% at less selective universities.

This is the insidious problem of a “lazy System 2.” System 1 surfaces the intuitive answer for System 2 thinking to evaluate. But a lazy System 2 doesn’t do its job properly: it accepts what System 1 offers without expending the small investment of effort that could have rejected the wrong answer.

Even worse, this aggravates confirmation bias. A piece of information that fits your prior beliefs might evoke a positive System 1 feeling, while your System 2 might never pause to evaluate the validity of the piece of information. If you believe a conclusion is true, you might believe arguments that support it, even when the arguments are unsound.

It’s useful then to distinguish between intelligence and rationality. 

  • Intelligence might be considered the full computational horsepower of a person’s brain. 
  • Rationality is resistance to mental laziness: not accepting a superficially plausible answer, being more skeptical of intuitions, and tending to put in the hard work of checking the logic, and thus greater immunity to biases. 

In other words, a powerful System 2 is useless if the person doesn’t recognize the need to override their System 1 response.

The theme here, which will recur throughout the book, is that people are overconfident and place too much faith in their intuitions. Further, they find cognitive effort unpleasant and avoid it as much as possible.

While System 1 is useful, the best decisions come from using both System 1 and System 2 thinking.


———End of Preview———

Like what you just read? Read the rest of the world's best summary of "Thinking, Fast and Slow" at Shortform. Learn the book's critical concepts in 20 minutes or less.

Here's what you'll find in our full Thinking, Fast and Slow summary:

  • Why we get easily fooled when we're stressed and preoccupied
  • Why we tend to overestimate the likelihood of good things happening (like the lottery)
  • How to protect yourself from making bad decisions and from scam artists
Amanda Penn

Amanda Penn is a writer and reading specialist. She’s published dozens of articles and book reviews spanning a wide range of topics, including health, relationships, psychology, science, and much more. Amanda was a Fulbright Scholar and has taught in schools in the US and South Africa. Amanda received her Master's Degree in Education from the University of Pennsylvania.
