System 1 Thinking: How It Works (And When You Shouldn’t Trust It)

This article is an excerpt from the Shortform summary of "Thinking, Fast and Slow" by Daniel Kahneman.

What is “System 1 Thinking,” from Daniel Kahneman’s Thinking, Fast and Slow? When should I use it, and when shouldn’t I?

System 1 thinking is thinking that operates automatically and quickly. It takes little or no effort and involves no sense of voluntary control.

We’ll cover how Kahneman’s System 1 thinking is involved in making judgments and what biases System 1 thinking leaves you susceptible to.

Two Systems of Thinking

We believe we’re being rational most of the time, but in reality much of our thinking is automatic, done subconsciously by instinct. Most impressions arise without your knowing how they got there. Can you pinpoint exactly how you knew a man was angry from his facial expression, how you could tell that one object was farther away than another, or why you laughed at a funny joke?

This becomes more practically important for the decisions we make. Often, we’ve decided what we’re going to do before we even realize it. Only after this subconscious decision does our rational mind try to justify it.

The brain does this to save effort, substituting easier questions for harder ones. Instead of asking, “Should I invest in Tesla stock? Is it priced correctly?” you might ask, “Do I like Tesla cars?” The insidious part is that you often don’t notice the substitution. This type of substitution produces systematic errors, also called biases. We are blind to our blindness.

System 1 and System 2 Thinking

In Thinking, Fast and Slow, Kahneman defines two systems of the mind:

System 1 thinking: operates automatically and quickly, with little or no effort, and no sense of voluntary control

  • System 1 Thinking Examples: Detect that one object is farther than another; detect sadness in a voice; read words on billboards; understand simple sentences; drive a car on an empty road.

System 2 thinking: allocates attention to the effortful mental activities that demand it, including complex computations. Often associated with the subjective experience of agency, choice and concentration

  • System 2 Thinking Examples: Focus attention on a particular person in a crowd; exercise at a faster pace than is normal for you; monitor your behavior in a social situation; park in a narrow space; multiply 17 × 24.

Properties of System 1 Thinking

Kahneman’s System 1 thinking can be completely involuntary. You can’t stop your brain from completing 2 + 2 = ?, or from registering a cheesecake as delicious. You can’t unsee optical illusions, even if you rationally know what’s going on.

System 1 thinking can arise from expert intuition, trained over many hours of practice. In this way, a chess master can recognize a strong move within a second, whereas a novice would need several minutes of System 2 thinking.

System 1 thinking automatically generates suggestions, feelings, and intuitions for System 2. If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions. 

System 1 thinking can detect errors and recruit System 2 for additional firepower.

  • Kahneman tells a story of a veteran firefighter who entered a burning house with his crew, felt something was wrong, and called for them to get out. The house collapsed shortly after. He only later realized that his ears were unusually hot but the fire was unusually quiet, indicating the fire was in the basement.

Because System 1 thinking operates automatically and can’t be turned off, biases are difficult to prevent. Yet it’s also not wise (or energetically possible) to constantly question System 1, and System 2 is too slow to substitute in routine decisions. “The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.”

In summary, most of what you consciously think and do originates in Kahneman’s System 1 thinking, but System 2 takes over when the situation gets difficult, and it normally has the last word.

System 1 Thinking: How We Make Judgments

System 1 thinking continuously monitors what’s going on outside and inside the mind and generates assessments with little effort and without intention. The basic assessments include language, facial recognition, social hierarchy, similarity, causality, associations, and exemplars. 

  • In this way, you can look at a male face and consider him competent (for instance, if he has a strong chin and a slight confident smile).
  • The survival purpose is to monitor surroundings for threats.

However, not every attribute of a situation is measured. System 1 thinking is much better at comparing things and estimating averages than at computing sums. Here’s an example:

Imagine a picture showing several lines of varying lengths. Try to quickly judge the average length of the lines.

Now try to determine the sum of their lengths. This is less intuitive and requires System 2.

Unlike System 2 thinking, these basic assessments of System 1 thinking are not impaired when the observer is cognitively busy.

In addition to basic assessments, Kahneman’s System 1 thinking has two other characteristics:

1) Translating Values Across Dimensions, or Intensity Matching

System 1 thinking is good at comparing values on two entirely different scales. Here’s an example.

Consider a minor league baseball player. Compared to the rest of the population, how athletic is this player? 

Now compare your judgment to a different scale: If you had to convert how athletic the player is into a year-round weather temperature, what temperature would you choose?

Just as a minor league player is above average but not top tier, the temperature you chose was probably warm but not extreme, something like 80 degrees Fahrenheit.

As another example, consider expressing crimes and punishments as musical volume. If a soft-sounding crime is paired with a piercingly loud punishment, the mismatch suggests an injustice.

2) Mental Shotgun

Kahneman’s System 1 thinking often carries out more computations than are needed. Kahneman calls this “mental shotgun.”

For example, consider whether each of the following three statements is literally true:

  • Some roads are snakes.
  • Some jobs are snakes.
  • Some jobs are jails.

All three statements are literally false. The second statement probably registered as false more quickly, while the other two took more time because they are metaphorically true. Even though finding metaphors was irrelevant to the task, you couldn’t help noticing them, and the mental shotgun slowed you down. Your System 1 made more calculations than it had to.

Biases of System 1 Thinking

Putting it all together, we are most vulnerable to biases when:

  • System 1 thinking forms a narrative that conveniently connects the dots and doesn’t express surprise.
  • Because of the cognitive ease produced by System 1 thinking, System 2 is not invoked to question the data; it merely accepts System 1’s conclusions.

In day-to-day life, this is fine if the conclusions are likely to be correct, the costs of a mistake are acceptable, and the jump saves time and effort. You don’t question whether to brush your teeth each day, for example.

In contrast, this shortcut in thinking is risky when the stakes are high and there’s no time to collect more information, like when you’re serving on a jury, deciding which job applicant to hire, or figuring out how to behave in a weather emergency.

Here’s a collection of Kahneman’s System 1 thinking biases.

System 1 Thinking Bias #1: Ordering Effect

First impressions matter. They form the “trunk of the tree” to which later impressions are attached like branches. It takes a lot of work to reorder the impressions to form a new trunk.

Consider two people who are described as follows:

  • Amos: intelligent, hard-working, strategic, suspicious, selfish
  • Barry: selfish, suspicious, strategic, hard-working, intelligent

Most likely you viewed Amos as the more likable person, even though the five words used are identical, just differently ordered. The initial traits change your interpretation of the traits that appear later.

This explains a number of effects:

  • Pygmalion effect: A person’s expectation of a target person affects the target person’s performance. Have higher expectations of a person, and they will tend to do better. 
    • In an experiment, students were randomly ranked in a report of academic performance that was then given to their teachers. Students who were randomly rated as more competent ended the year with better academic scores, even though there was no average difference between the groups at the start of the year.
  • Kahneman used to grade exams by reading one student’s entire test before moving to the next student’s. He found that the student’s first essay dramatically influenced his interpretation of later essays: an excellent first essay earned the student the benefit of the doubt on a weak second essay, while a poor first essay cast doubt on later strong essays. He corrected for this by grading one essay question across all students before moving on to the next question.
  • Work meetings often polarize around the first and most vocal people to speak. Meetings would surface better ideas if people wrote down their opinions beforehand.
  • Witnesses are not allowed to discuss events in a trial before testimony. 

The antidote to the ordering effect:

  • Before having a public discussion on a topic, elicit opinions from the group confidentially first. This avoids bias in favor of the first speakers.

System 1 Thinking Bias #2: Mere Exposure Effect

Exposing someone to an input repeatedly makes them like it more. Having encountered a word, phrase, or idea before makes it easier to process the next time you see it.

System 1 Thinking Bias #3: Narrative Fallacy

This is explained more in Part 2, but it deals with System 1 thinking. 

People want to believe a story and will seek cause-and-effect explanations in times of uncertainty. This helps explain the following:

  • Stock market movements are explained like horoscopes, where the same explanation can be used to justify both rises and drops (for instance, the capture of Saddam Hussein was used to explain both the rise and subsequent fall of bond prices).
  • Most religions explain the creation of earth, of humans, and of the afterlife.
  • Famous people are given origin stories – Steve Jobs reached his success because of his abandonment by his birth parents. Sports stars who lose a championship have the loss attributed to a host of reasons. 

Once a story is established, it becomes difficult to overwrite. (Shortform note: this helps explain why frauds like Theranos and Enron were able to persist for so long; observers believed the story they wanted to hear.)

System 1 Thinking Bias #4: Affect Heuristic

How much you like or dislike something determines your beliefs about the world.

For example, say you’re making a decision with two options. If you like one particular option, you’ll believe the benefits are better and the costs/risks more manageable than those of alternatives. The inverse is true of options you dislike.

Interestingly, if you get a new piece of information about an option’s benefits, you will also decrease your assessment of the risks, even though you haven’t gotten any new information about the risks. You just feel better about the option, which makes you downplay the risks.

Vulnerability to Bias

We’re more vulnerable to biases when System 2 is taxed.

To explain this, psychologist Daniel Gilbert has a model of how we come to believe ideas:

  • System 1 thinking constructs the best possible interpretation of the idea: if it were true, what would it mean?
  • System 2 then evaluates whether to believe the idea, “unbelieving” ideas that are false.

When System 2 is taxed, it doesn’t scrutinize System 1’s beliefs as closely, so we’re more likely to accept whatever System 1 suggests.

Experiments show that when System 2 is taxed (for example, when you’re forced to hold digits in memory), you become more likely to believe false sentences. You’ll believe almost anything.

This might explain why infomercials are effective late at night. It may also explain why societies in turmoil might apply less logical thinking to persuasive arguments, such as Germany during Hitler’s rise.

———End of Preview———

Amanda Penn

Amanda Penn is a writer and reading specialist. She’s published dozens of articles and book reviews spanning a wide range of topics, including health, relationships, psychology, science, and much more. Amanda was a Fulbright Scholar and has taught in schools in the US and South Africa. Amanda received her Master's Degree in Education from the University of Pennsylvania.
