Mental Models: What They Are & How to Apply Them

What are mental models? What are some ways mental models can be useful in your everyday life?

Mental models are recurring concepts and patterns that explain a wide array of situations. For example, physicists might use Newton’s law of inertia to predict that planets in motion will remain in motion, and biologists might rely on natural selection to explain the origin of the fight-or-flight response.

In this article, you’ll learn about the concept of mental models and how they can be useful in everyday life.

What Are Mental Models?

What are mental models? In short, mental models are concepts and patterns that help us understand various situations across distinct topics. For instance, the mental model of supply and demand holds that the balance between the two determines prices: If supply exceeds demand, the price falls, and if demand exceeds supply, the price rises.
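As a minimal sketch of our own (not from the book), the supply-and-demand rule can be written as a simple price-adjustment step, where the price moves in the direction of excess demand. The function name and the adjustment rate are illustrative assumptions:

```python
def adjust_price(price, supply, demand, step=0.05):
    """Nudge price toward balance: excess demand raises it,
    excess supply lowers it. `step` is an illustrative rate."""
    if demand > supply:
        return price * (1 + step)  # scarcity: price rises
    if supply > demand:
        return price * (1 - step)  # glut: price falls
    return price  # market clears: price holds

# Demand outstrips supply, so the price rises above 100.
print(adjust_price(100.0, supply=80, demand=120))
```

Real markets are far messier, of course, but the sketch captures the direction of the rule: the side with the shortage bids the price its way.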

(Shortform note: In The Great Mental Models Volume 1, Shane Parrish and Rhiannon Beaubien describe mental models as representations of how things work. Weinberg and McCann, however, go further, using the term “mental models” to describe a wide array of concepts beyond mere representations of how things work. For example, they even refer to generic phenomena such as multitasking and organizational culture as mental models. For clarity, we’ve italicized the mental models included in this guide.)

To understand the general advantages of mental models, we’ll discuss mental models that help us avoid common pitfalls in decision-making. This discussion is two-fold: First, we’ll examine models that help us avoid shoddy reasoning, and second, we’ll examine models that help us avoid unintended consequences. By applying these models, we’ll make better-informed decisions whose consequences we understand.

Pitfall 1: Shoddy Reasoning

Weinberg and McCann assert that, when reasoning, we naturally defer to conventional thinking and our intuition, which is our ability to reason subconsciously. However, both are shaped by inflexible assumptions, which means they can be rigid. For example, the practice of bloodletting (withdrawing a patient's blood for medicinal purposes) was part of conventional medicine because it fit neatly with Humorism, the theory that the body is composed of four humors (blood, phlegm, black bile, and yellow bile). Because Humorism was an entrenched assumption, belief in the efficacy of bloodletting persisted for roughly 3,000 years, until the practice was largely discredited in the late 1800s.

In light of this rigidity, conventional thinking and intuition can mislead us in situations where they’re inappropriate. For instance, in the case of bloodletting, the conventional assumptions of Humorism misled physicians into harming their patients. In this section, we’ll examine such situations so we know when to avoid this type of reasoning. 

Inappropriate Intuition

To determine whether intuition is inappropriate, Weinberg and McCann use Daniel Kahneman’s model of fast and slow thinking, outlined in Thinking, Fast and Slow. Kahneman distinguishes between fast thinking, where our mind operates quickly and involuntarily (as in doing basic math), and slow thinking, where we reason carefully and deliberately (as in doing calculus). 

They argue that we should avoid leaning on intuition in situations suited for slow thinking. For instance, imagine that you’re an American visiting a less-talkative culture, and none of the locals are willing to talk with you. If you rely on intuition, you might incorrectly conclude they’re impolite or rude, when there are simply different norms at play. In this case, use slow thinking to consider these unfamiliar norms instead of making a snap judgment.

Reason From First Principles

In situations that call for slow thinking, a more helpful mental model is reasoning from first principles. Weinberg and McCann explain that first principles are self-evident assumptions that ground your reasoning. For example, in deciding which career path to pursue, you might follow the first principle that upward mobility is valuable.

First principles provide a sturdy foundation for our beliefs, helping us sidestep the often-misguided assumptions underlying conventional thinking. For example, conventional astronomy once placed the Earth at the center of the solar system. Copernicus, however, rejected this view because he reasoned from the first principle that the most logical mathematical model of the planets' movements must be correct. Since the heliocentric model best explained those movements, he accepted it instead.

De-Risk Assumptions

Still, even seemingly self-evident assumptions can be mistaken. To keep false assumptions from infiltrating your reasoning, Weinberg and McCann recommend another mental model: de-risking your assumptions by testing their validity against objective measures.

This de-risking process can look different depending on the assumption. For instance, a conservative political candidate might assume they’ll comfortably win historically Republican districts. In this case, de-risking might involve conducting a poll to test this assumption.

Account for Your Frame of Reference

False assumptions aren’t only a risk when reasoning from first principles—we’re also susceptible to false assumptions rooted in our own perspective. 

For example, consider a model from Einstein's theory of relativity: the frame of reference. In physics, an object's frame of reference is, roughly, the vantage point relative to which its speed and direction are measured. To oversimplify: Relative to our seat inside a moving airplane, we're currently stationary, though relative to the airport, we're traveling north at 550 mph.

We all have our own personal frames of reference: the subjective experience through which we perceive the world. To reason objectively, we need to keep in mind our own frame of reference and the biases afflicting it. Weinberg and McCann caution that one such bias is availability bias, where we overweight information that comes easily to mind, such as recently acquired information. For instance, a news report about recent shark attacks might make us think they're a major threat, when they actually cause only about one fatality every other year.

Fundamental Attribution Error

Another issue caused by our frame of reference is the fundamental attribution error, where we attribute others' behavior to their character or personality rather than to external circumstances. For example, if our server seems cold, we might conclude that they're rude, not that they've simply had a bad day.

To combat this error, Weinberg and McCann encourage seeking out the most respectful interpretation of others’ behaviors. Essentially, this involves viewing others’ actions charitably, rather than cynically. This advice squares well with Hanlon’s Razor, a mental model stating that we shouldn’t attribute someone’s behavior to malice if we can instead attribute it to carelessness.

(Shortform note: Hanlon’s Razor is only meant to be a general heuristic. As such, it can lead us astray in situations where malice is a better explanation than carelessness, even if both offer possible explanations. In Pride and Prejudice, Jane Bennett exemplifies the danger of misusing Hanlon’s Razor: When Caroline Bingley repeatedly tries to sabotage Jane’s budding relationship with her brother, Jane repeatedly—and implausibly—gives Caroline the benefit of the doubt.)

Pitfall 2: Undesired Consequences

Now that we’ve seen how to avoid shoddy reasoning, we’ll learn how to avoid harmful consequences by improving our predictions about the future. To do so, we’ll learn several mental models that illuminate the inadvertent consequences of our actions.

Inadvertently Harming Your Neighbor

The first unintended outcome Weinberg and McCann discuss results from the tyranny of small decisions, where individually reasonable decisions collectively create a worse outcome for everyone. For instance, you might think your vote doesn’t matter and therefore refrain from voting. While this seems reasonable, democracy would crumble if everyone followed suit.

The tyranny of small decisions is especially prevalent with public goods, since it's tempting to think it won't matter if we take more than our rightful share. But when everyone takes more, the shared resource collapses, illustrating another mental model: the tragedy of the commons. For instance, at the onset of the Covid-19 pandemic, people flocked to stores to buy excessive amounts of groceries, creating shortages that left many without vital items.
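To make the dynamic concrete, here's a tiny simulation of our own (the numbers are invented for illustration, not from the book) showing how a seemingly small over-withdrawal by each person exhausts a shared resource much sooner:

```python
def simulate_commons(stock, people, take_per_round, rounds):
    """Deplete a shared stock as each person withdraws
    `take_per_round` units per round. Returns the round on which
    the stock runs out, or None if it survives all rounds."""
    for r in range(1, rounds + 1):
        stock -= people * take_per_round
        if stock <= 0:
            return r  # resource exhausted on this round
    return None  # resource survived

# 100 people sharing 10,000 units: the "fair share" of 1 unit per
# round lasts all 100 rounds, but taking just 1.25 units each
# depletes the commons 20 rounds early.
print(simulate_commons(10_000, 100, 1.25, rounds=100))  # 80
```

Each individual's extra quarter-unit looks negligible, yet collectively it shaves a fifth off the resource's lifespan, which is the tragedy of the commons in miniature.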

Kant’s Antidote to the Tyranny of Small Decisions

In his Groundwork for the Metaphysics of Morals, Immanuel Kant recognized that decisions that seem individually reasonable can lead to collective catastrophe if everyone makes them. Consequently, he argued that for a decision to be rational, it must be universalizable: You must be able to rationally will that anyone in the same context make the same decision. For example, lying is normally irrational because, if everyone constantly lied, the trust that underlies our communication would be destroyed.

Accordingly, consulting Kant’s universalizability maxim can help prevent the tyranny of small decisions and the tragedy of the commons. If we ask, “Would it be rational to want everyone to make the decision I’m contemplating?” we can better determine which decisions might lead to collective disaster. In practice, this can help us understand why we should normally vote, tell the truth, and conserve public goods, to list a few.

Failure to Think Long-Term

In a similar vein, some decisions yield benefits in the short term while leading to long-term disaster. To illustrate, consider the mental model of the boiling frog: a frog that stays in a pot of water as the temperature gradually rises, until it's eventually boiled alive. Although the warming water feels pleasant in the short term, it spells long-term catastrophe.

Weinberg and McCann argue that we’re susceptible to such decisions because of short-termism. In finance, this refers to emphasizing short-term results to the detriment of long-term results. For instance, a company that only focuses on marketing current products might fail to develop new products necessary for future success. Weinberg and McCann suggest that short-termism afflicts our general decision-making as well.

To combat short-termism, heed the precautionary principle: Act with extreme caution when an action's potentially harmful consequences are unknown. Concretely, this means asking yourself, "Is there reason to think this action has dangerous consequences?" If the answer is "yes," pause before acting. By adhering to this principle, Weinberg and McCann suggest, we're less likely to run into harmful consequences down the road.

Darya Sinusoid
