Making Bad Decisions: When Your Logic Turns Against You

This article gives you a glimpse of what you can learn with Shortform. Shortform has the world’s best guides to 1000+ nonfiction books, plus other resources to help you accelerate your learning.

Want to learn faster and get smarter? Sign up for a free trial here.

Why do we make bad decisions? More importantly, how can we learn from our mistakes to make better decisions in the future?

Our lives are full of big decisions, yet our minds are wired to make bad ones. Sometimes our decisions turn against us because of bad luck. Most of the time, however, our biases and illogical ways of thinking steer us toward making bad decisions. 

Here’s a look at the most common logical fallacies that affect our ability to make good decisions. 

Neglect of Probability

One common reason people make bad decisions is that they neglect to consider the probability or risk involved in their decisions. According to Rolf Dobelli, the author of The Art of Thinking Clearly, people tend to choose the option that will have the biggest positive impact on them if it occurs, regardless of how likely it is to occur. 

For example, people are more likely to invest in a risky stock that could deliver a huge return than in a safe stock with lower but steadier returns.

In situations like this, the most logical decision is to choose the option with the highest probability of going well and the lowest risk of going badly. 
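
To make that concrete, here’s a minimal sketch in Python that weighs two hypothetical investments by expected value (each payoff multiplied by its probability) rather than by best-case payoff; all the figures are invented for illustration:

```python
def expected_value(outcomes):
    """Sum of payoff * probability over every possible outcome."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Hypothetical risky stock: 5% chance of a $10,000 gain,
# 95% chance of losing the $1,000 stake.
risky = [(10_000, 0.05), (-1_000, 0.95)]

# Hypothetical safe stock: 90% chance of a $200 gain, 10% chance of a $100 loss.
safe = [(200, 0.90), (-100, 0.10)]

print(expected_value(risky))  # -450.0: the eye-catching option loses on average
print(expected_value(safe))   # 170.0: the modest option wins on average
```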

Denominator Neglect

A type of neglect that Dobelli doesn’t cover is denominator neglect, which Daniel Kahneman describes in Thinking, Fast and Slow. Think of probability as a fraction, with the number of cases you’re evaluating on top and the total number of possibilities on the bottom. People tend to base their judgment of risk solely on the top number (the numerator) and ignore the bottom number (the denominator). As a result, they regularly misjudge probability, since the numerator only has meaning relative to the denominator.

For example, if you only considered the numerators when comparing a 1/100 chance of injury and a 100/10,000 chance of injury, you’d misinterpret the second scenario as much more likely to occur, even though the probability of injury is actually the same.
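
A quick sketch makes the point concrete: dividing each numerator by its denominator shows the two risks are identical.

```python
risk_a = 1 / 100        # 1 injury per 100 people
risk_b = 100 / 10_000   # 100 injuries per 10,000 people

print(risk_a, risk_b, risk_a == risk_b)  # 0.01 0.01 True
```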

The Availability Heuristic

The availability heuristic is the tendency to overweight information that comes to mind easily, such as recent or vivid events, and to ignore other important data that would lead to a better decision.

For example, imagine a parent who is watching the local news a week before Halloween. He sees a report describing people slipping illegal substances into Halloween candy. He decides that, this year, he will check every piece of his child’s Halloween candy. Even though tampering is extremely uncommon, the parent thinks it’s likely that his child’s candy will be tampered with, because the news report sticks so vividly in his mind.

The availability heuristic steers us toward making bad decisions because it can cause us to give undue weight to vivid but unrepresentative experiences or pieces of information. Imagine you’re planning a trip, and you’re deciding between renting a house in two beach towns. You’ve researched online reviews from vacationers in both towns: Town #1 has overwhelmingly positive reviews, while Town #2 has overwhelmingly negative reviews. You think you’ve made up your mind, but then a friend tells you she took a trip to Town #2 and had a great experience. You book a trip to Town #2 because her advice was more vivid to you than the negative online reviews. Unfortunately, the reviews were right, and you have a bad experience in Town #2.

To counteract the availability heuristic, Barry Schwartz, the author of The Paradox of Choice, recommends collecting information from a variety of sources and people. When you gather multiple opinions, no single vivid account can dominate your judgment, which helps you make informed decisions despite this cognitive bias.

The Sunk Cost Fallacy 

The sunk cost fallacy steers us toward bad decisions when we’ve already invested considerable energy and resources in a project or endeavor: we keep going because of what we’ve put in, even when the costs of continuing outweigh the potential benefits.

For example, we sit through a movie we hate because we paid for it or we keep waiting when our doctor or lawyer is late rather than rescheduling the appointment. Similarly, people stay in bad relationships or continue gambling to win their money back. 

It’s human nature to respond this way—the more you invest (sunk costs), the harder it is to pull the plug on a commitment. But sometimes the best decision is to cut your losses. In his book Essentialism, Greg McKeown highlights common sunk cost fallacy traps to watch out for: 

The ‘endowment effect.’ This is our tendency to overvalue things we own and place less value on things that don’t belong to us. For instance, when you rent a car, you don’t wash it before returning it because you don’t value it like you do your own car. We have this same tendency when it comes to nonessential activities, as well as belongings. When we feel we own an activity, it becomes harder to uncommit from it — for instance, your commitment to volunteer at a bake sale seems hard to get out of when you’re the one who organized the event. 

  • Tip: for an activity, ask how hard you’d try to get in on it if you weren’t already involved. For a possession, ask how much you’d pay for it if you didn’t own it. For an opportunity, ask how much you’d sacrifice to get it if you didn’t already have it. If the answer is “not much,” you know how much you truly value that activity, possession, or opportunity.

Fear of wasting something. The main reason the sunk cost effect persists in spite of evidence that a project is failing is that we have an aversion to wasting time, money, or anything else that might have value. We’ve been trained this way since childhood. Abandoning a losing project feels like wasting your investment, but if you don’t let go of a bad choice, you doom yourself to wasting even more.

Reluctance to admit mistakes. We have to admit mistakes before we can move on. By remaining in denial we continue to circle pointlessly, like people (before Google) who couldn’t stop and ask for directions because they couldn’t admit to being lost. (This is the epitome of a nonessential activity.) 

  • Tip: don’t be ashamed of admitting a mistake; you’re only acknowledging that you’re smarter now than you were before.

Forcing something to work. When you’ve fallen prey to the sunk cost effect, it’s tempting to keep trying to force the project to work, which of course is a waste of effort. 

  • Tip: to break the pattern, get a neutral second opinion. An objective observer’s opinion that it’s a lost cause can make us feel better about giving it up.

Not questioning the status quo. We have a tendency (called the status quo bias) to keep doing something because we’ve always done it. For instance, companies keep using old systems that have long outlived their effectiveness without questioning them. 

  • Tip: one solution is to apply the technique of zero-based budgeting (starting from scratch rather than working from last year’s budget). Apply the technique to your life this way: don’t base how you spend your time on existing commitments; assume they don’t exist and ask which ones you’d take on today. From scratch, justify everything you do.

Making casual commitments. Don’t make spur-of-the-moment commitments, like when a friend mentions a new restaurant she wants to try and you say you’ll go with her. It’s easy to fill your calendar with commitments added on the fly while chatting with someone. 

  • Tip: pause a few seconds before answering a request to give yourself time to consider whether it’s essential. If you have second thoughts about a commitment you’ve made, it’s better to apologize and back out than to do something nonessential.

Loss Aversion Bias

Loss aversion is a cognitive bias whereby individuals would rather avoid losses than acquire gains. We feel losses more strongly than gains. Losing $10 causes a stronger reaction than gaining $10 does.

According to Charlie Munger, the author of Poor Charlie’s Almanack, the loss aversion bias synergizes strongly with the sunk cost bias: the greater the investment in your venture (the sunk cost), the greater your unwillingness to make changes.

For example, your business may be failing and require drastic changes to rescue it, changes that would upend people’s livelihoods or deprive you of a way of life. The prospect of those losses triggers a strong deprival superreaction and a pull toward inertia, so you end up not making the necessary changes.

To avoid making bad decisions due to loss aversion, calibrate your losses in absolute terms rather than relative ones. For instance, losing a $100 bill from your wallet feels painful, but it’s far less impactful than a bad habit that costs $2,000 a year or a 1% loss on a large investment.
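
As a rough illustration, here’s a minimal sketch that prices the three losses above in absolute terms over a single year; the $500,000 portfolio is a hypothetical stand-in for “a large investment”:

```python
lost_bill = 100                    # a one-time, vivid loss
bad_habit = 2_000                  # a quiet, recurring annual cost
portfolio_loss = 0.01 * 500_000    # a 1% dip on a hypothetical $500,000 investment

# Over a single year, the vivid loss is by far the smallest of the three.
print(lost_bill, bad_habit, portfolio_loss)  # 100 2000 5000.0
```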

In Skin in the Game, Nassim Taleb argues that loss aversion is not necessarily a bad thing. He argues that this mental bias helps us survive. He takes into account one factor that, he claims, invalidates the conclusions of most social scientists who attempt to deal with probability: the effect of “ruin.”

“Ruin” is a state of loss that you can’t come back from. If a business suffers enough losses that it’s forced to shut down, it’s ruined. Even if its profits would skyrocket in the next quarter, it doesn’t matter. The business has suffered a permanent loss.

Opportunities for ruin are all around us, yet they’re largely ignored when academics try to analyze risk. It’s impossible to mathematically calculate risk versus reward when ruin is a potential outcome, because no benefit could outweigh the finality of ruin.

Taleb uses the example of Russian roulette. Imagine someone offers you one million dollars to load one bullet into a six-chamber pistol, spin the cylinder, and fire at your own head. Traditional cost-benefit analysis would conclude that, on average, you can expect to make $833,333. That doesn’t sound too bad! In reality, however, almost no one would take this deal. Cost-benefit analysis leads to invalid conclusions if you’re risking a permanent loss.
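
Here’s a minimal sketch of that arithmetic: the naive expected value looks attractive only because it treats ruin as just another payoff, and repeated play makes ruin all but certain:

```python
payout = 1_000_000
p_survive = 5 / 6

# Naive expected value: treats death as just "payout = 0".
naive_ev = p_survive * payout
print(round(naive_ev))  # 833333

# But ruin compounds: play repeatedly and it becomes near-certain.
def p_ruin(rounds):
    return 1 - p_survive ** rounds

print(round(p_ruin(20), 3))  # 0.974
```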

With this in mind, the human instinct to overcompensate and avoid risks appears more rational. Even if there’s only a tiny chance of total ruin, it’s worth taking precautions against it. For example, only about 5% of homeowners with insurance ever successfully file a claim, yet around 85% of homeowners insure their homes. On average, you’d come out ahead financially by refusing home insurance, but it’s worth the cost to hedge against ruin.

Narrow Framing

When you evaluate a decision, you tend to focus on that individual decision rather than on the big picture of all decisions of that type. As a result, you’re prone to making bad decisions, because a choice that makes sense in isolation can become very costly when repeated many times.

Consider both decision pairs, then decide what you would choose in each:

Pair 1:

  1) A certain gain of $240.

  2) A 25% chance of gaining $1,000 and a 75% chance of gaining nothing.

Pair 2:

  3) A certain loss of $750.

  4) A 75% chance of losing $1,000 and a 25% chance of losing nothing.

Because people tend to be risk-averse with gains and risk-seeking with losses, you likely gravitated to Option 1 and Option 4. But let’s combine those two options and weigh the result against the other pairing:

1+4: 75% chance of losing $760 and 25% chance of gaining $240

2+3: 75% chance of losing $750 and 25% chance of gaining $250

Even without calculating expected values, 2+3 is clearly superior to 1+4: you have the same 75% chance of losing less money ($750 instead of $760) and the same 25% chance of gaining more ($250 instead of $240). Yet you probably didn’t think to combine the options across pairs and compare the combinations against each other.

This is the difference between narrow framing and broad framing. The ideal broad frame considers every combination of options to find the optimum. That’s obviously more cognitively taxing, so instead you use the narrow heuristic: what’s best for each decision at each point?
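
As an illustration of broad framing, here’s a minimal sketch that enumerates the joint outcomes of each cross-pair choice from the example above, instead of judging each option in isolation:

```python
from itertools import product

# Each option is a list of (probability, payoff) outcomes.
option_1 = [(1.00, 240)]                # a certain gain of $240
option_2 = [(0.25, 1_000), (0.75, 0)]   # 25% chance of gaining $1,000
option_3 = [(1.00, -750)]               # a certain loss of $750
option_4 = [(0.75, -1_000), (0.25, 0)]  # 75% chance of losing $1,000

def combine(a, b):
    """All joint outcomes of making both choices at once."""
    return [(pa * pb, xa + xb) for (pa, xa), (pb, xb) in product(a, b)]

print(combine(option_1, option_4))  # [(0.75, -760), (0.25, 240)]
print(combine(option_2, option_3))  # [(0.25, 250), (0.75, -750)]
```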

To avoid making bad decisions because of narrow framing, Daniel Kahneman, the author of Thinking, Fast and Slow, recommends adopting risk policies: simple rules to follow in individual situations that give a better broad outcome. Examples:

  • Check your stocks only once a quarter. Don’t trade on emotion.
  • Always take the highest possible deductible for insurance.
  • Never buy extended warranties.

Confirmation Bias

Confirmation bias is the tendency to search for and favor information that underscores our existing beliefs. In decision-making, confirmation bias causes us to ignore information that supports our non-preferred options, even if those options are better.

For instance, imagine a man is deciding whether to rent a home in Neighborhood A or B. His friend, who lives in Neighborhood A, once told him a harrowing story of a break-in at her house. As he researches the two neighborhoods’ safety, his confirmation bias kicks in. Without realizing it, he searches for information that supports his hunch that Neighborhood A is unsafe. He doesn’t seek out information about safety in Neighborhood B. He then chooses Neighborhood B, even though Neighborhood B’s crime rates are much higher.

One way to counteract confirmation bias is to look for information that contradicts your existing beliefs. When we consider opposing information, we force ourselves to pay attention to high-quality information that we’d otherwise ignore due to confirmation bias.

A technique to accomplish this strategy is to find an expert on the topic of your decision and ask questions that welcome an opposing viewpoint. For example, imagine a high school graduate who must choose among several colleges. One college tops their list because they believe it has a strong art program. They meet with an admissions counselor from that college to learn more. Their confirmation bias tempts them to ask questions that confirm their belief that the art program is strong. Instead, they can seek out an opposing viewpoint by asking, “What are some issues with the art program?”

Prepare for Future Outcomes of Your Decisions

While knowing the factors that steer our decision-making in the wrong direction is helpful, it’s also important to know how to prepare for the implications of a bad decision. In their book Decisive, authors Dan and Chip Heath share several strategies to prepare for future outcomes of our decisions. 

Strategy 1: Make Contingency Plans

One way to prepare for the outcomes of your decisions is to plan for both best-case and worst-case scenarios. Planning for both ensures that the worst-case scenario won’t be devastating and that we’ll be ready to capitalize on the best-case scenario.

For example, consider a woman who decides to quit her job and open a food truck. First, she imagines a worst-case scenario and makes a plan for it:

  • Worst-case scenario: An accident totals her food truck.
  • Plan: She guards against this outcome by purchasing insurance.

Next, she imagines a best-case scenario and makes a plan to be ready for it:

  • Best-case scenario: A food critic’s rave review doubles her number of customers.
  • Plan: She contacts several friends who work in food service. She tells them that if her demand starts to double, she’ll offer them a job to join her in the kitchen.

This strategy is similar to Annie Duke’s “scenario planning” technique, which she describes in her book Thinking in Bets. Scenario planning is an exercise in which you imagine every potential outcome of your decision. Duke adds a step, however: After you imagine each outcome, estimate its likelihood. Duke claims that this exercise helps you make a more rational decision. In contrast to the Heath brothers’ strategy, Duke intends for scenario planning to happen before the decision is made. Therefore, the two strategies could both be worked into a decision-making process: Scenario planning could serve as a tool to inform a decision, and the Heath brothers’ contingency planning strategy could serve as a tool to prepare for its outcomes.
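
As a rough illustration of Duke’s exercise, here’s a minimal sketch that lists hypothetical outcomes for the food truck decision, estimates each one’s likelihood, and checks that the estimates account for every possibility:

```python
# Hypothetical scenarios and probability estimates for the food truck decision.
scenarios = {
    "food truck thrives":    0.20,
    "business breaks even":  0.45,
    "business loses money":  0.25,
    "accident totals truck": 0.10,
}

# The likelihood estimates must cover all outcomes, i.e., sum to 100%.
assert abs(sum(scenarios.values()) - 1.0) < 1e-9

# Review the outcomes from most to least likely before deciding.
for outcome, prob in sorted(scenarios.items(), key=lambda kv: -kv[1]):
    print(f"{prob:>4.0%}  {outcome}")
```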

Strategy 2: Prepare for the Unexpected 

When it’s challenging to accurately predict a decision’s worst-case outcomes, we can still prepare for the unforeseeable by working “safety factors” into our predictive calculations. A safety factor is a cushion we can add to our prediction of the worst-case scenario. If it turns out that the worst-case scenario was too optimistic, then the safety factor helps us avoid disaster. 

For instance, imagine that the food truck owner estimates her business expenses. She multiplies those estimated expenses by a safety factor of 1.5. That way, if it turns out her estimate fails to account for unexpected costs, she avoids draining her bank account.
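
Here’s a minimal sketch of that cushion, with hypothetical figures:

```python
estimated_expenses = 40_000  # the owner's best worst-case estimate (hypothetical)
SAFETY_FACTOR = 1.5          # cushion for costs the estimate missed

budget = estimated_expenses * SAFETY_FACTOR
print(budget)  # 60000.0
```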

Strategy 3: Create an Alert 

A final step we can take to prepare for future outcomes is to ensure that we act early if we notice any signs that our decision is headed towards a negative outcome. To that end, the Heath brothers recommend creating an alert: a reminder for us to act that fires in response to early signs of a negative outcome. Alerts ensure that we recognize when it’s time to follow up our decision with another decision that steers us off the path toward a negative outcome.

One type of alert that the authors offer is a deadline. After you make a decision, choose a date in the near future. When that date arrives, evaluate the success of your decision so far and determine whether you need to change course.

Another type of alert that the authors offer is a pattern. After you make a decision, list several possible warning signs that the decision is headed toward a negative outcome. If you notice a pattern of more than one warning sign, it’s a signal that you need to either re-evaluate your decision or follow it up with another one.

For example, imagine a teacher in a classroom that has several students who are new to the school. He wants to ensure that they’re integrating well with their peers, and he determines that sitting alone at lunch would be a sign of poor integration. Each day, he observes the students eating. If he ever notices that a student sits alone more than two times, this pattern alerts him to decide how to better support them.
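
As a rough illustration, here’s a minimal sketch of such a pattern alert: it counts warning signs and fires once a chosen threshold is crossed; the names and threshold are invented:

```python
from collections import Counter

ALERT_THRESHOLD = 2  # act if a student sits alone more than twice

# Hypothetical log of students observed sitting alone at lunch.
lunch_log = ["Ana", "Ben", "Ana", "Ana"]

for student, times in Counter(lunch_log).items():
    if times > ALERT_THRESHOLD:
        print(f"Alert: check in with {student} ({times} lunches alone)")
# -> Alert: check in with Ana (3 lunches alone)
```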

Final Words

Life is full of big decisions, but hard-to-detect flaws in our thinking often prevent us from making good choices. No one is immune from making bad decisions, but everyone can learn how to make better decisions by working to overcome their biases and illogical ways of thinking. 

If you enjoyed our article about making bad decisions, check out the following suggestions for further reading: 

Nudge

Every day we face choices—what to order at a restaurant, what clothes to buy at a store, what show to stream after work. We make these choices without realizing how the way they’re presented affects us. If grocery stores didn’t stock candy at the register, would we eat less of it? If we had to “opt out” of being organ donors rather than “opt in,” would the organ donor pool grow?

In Nudge, Nobel Prize-winning economist Richard Thaler and legal scholar Cass Sunstein examine how the way choices are designed and structured can “nudge” us toward better decisions. 

Six Thinking Hats

In Six Thinking Hats, doctor and psychologist Edward de Bono takes the phrase “put your thinking cap on” to a new level. As de Bono explains, our normal thinking process is a hopeless tangle of six different types of thinking. We can avoid making bad decisions by untangling these six thinking types (symbolized by six hats of different colors) and deploying them more consciously.

Six Thinking Hats will teach you how to incorporate factual, emotional, critical, constructive, creative, and metacognitive information into your thinking process, along with strategies you can use to generate ideas in each of these modes. If you’re looking for ways to dramatically cut your decision-making time, calm your inner critic, or increase your team’s creativity, the Six Hats method can help.
