Seeking Wisdom: From Darwin to Munger (Book Overview)

Have you ever reflected on one of your past mistakes and wondered, “What was I thinking?” Why do we make bad decisions in spite of the human capacity for higher-order thinking?

In Seeking Wisdom: From Darwin to Munger, Peter Bevelin claims that we make significant errors because we tend to think irrationally. He argues that, fortunately, we can avoid major errors by following the wisdom of some of the world’s most rational thinkers. In his book, he collects advice on rational thinking from experts in science, business, and philosophy.

Below is a brief overview of Seeking Wisdom: From Darwin to Munger by Peter Bevelin.

Seeking Wisdom: From Darwin to Munger

Bevelin is a Swedish investor and author of four books that compile the knowledge of some of the world’s best-known thinkers. In Seeking Wisdom: From Darwin to Munger, Bevelin emphasizes how famous thinkers’ wisdom relates to business and investing. However, the conclusions he draws about human errors and the advice he provides about rational thinking are relevant to decision-making in any area of life.

In this guide, we’ll explore Bevelin’s research on mistakes and his strategies for thinking more rationally. We’ve divided this guide into three sections. First, we’ll describe the differences between rational and irrational thinking. Next, we’ll explore irrational errors we make because of the mismatch between the behaviors we evolved and the demands of modern life. Finally, we’ll examine irrational errors we make because we fail to understand and consider scientific concepts. For each error we cover, we’ll offer wisdom on how to avoid it.

Throughout our guide, we’ll compare Bevelin’s ideas on rational and irrational thinking to those of other experts, such as Donella Meadows and James Clear. Furthermore, we’ll supplement Bevelin’s strategies with additional actionable steps, such as how to calm yourself down before making a big decision.

Comparing Irrational and Rational Thinking

Bevelin argues that most of our major errors result from irrational thinking, and we can avoid these errors by engaging in rational thinking. When we’re irrational, we base our decisions on our emotion-fueled, biased assumptions rather than on facts. By contrast, when we’re rational, we override our emotions and biases by logically considering factual evidence.

(Shortform note: While Bevelin asserts that emotions interfere with wise, rational thinking, some psychologists offer the counterpoint that emotions can benefit your decision-making process in a number of ways. First, emotions can motivate you to make important decisions in the first place. For instance, your anger about an injustice may drive you to run for public office so that you can address the injustice. Second, some emotions can actually make your decisions less biased and therefore more rational. For example, research suggests that gratitude counteracts your discount rate bias: a cognitive bias that causes you to value instant gratification over long-term rewards.)
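To make the discount rate bias concrete, here's a minimal sketch we've added (it's not from Bevelin's book, and the dollar amounts and discount rates are illustrative assumptions). It uses the standard exponential-discounting formula, under which a reward received t years from now is worth reward / (1 + r)^t today:

```python
# Illustrative sketch of temporal discounting (not from Bevelin's book).
# A high personal discount rate r shrinks the present value of delayed
# rewards, which is how instant gratification wins out.

def present_value(reward: float, discount_rate: float, years: float) -> float:
    """Exponential discounting: what a future reward is worth today."""
    return reward / (1 + discount_rate) ** years

# Hypothetical choice: $60 now vs. $100 one year from now.
impatient = present_value(100, discount_rate=0.80, years=1)  # high discount rate
patient = present_value(100, discount_rate=0.05, years=1)    # low discount rate

print(f"To an impatient chooser, $100 in a year feels like ${impatient:.2f}")  # ~$55.56, so $60 now wins
print(f"To a patient chooser, $100 in a year feels like ${patient:.2f}")       # ~$95.24, so waiting wins
```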

Furthermore, Bevelin claims that we should learn rational thinking strategies from the world’s most successful people: those who have achieved financial success and contributed groundbreaking ideas to society. Bevelin contends that many of these figures are successful because they use rational thinking to avoid major mistakes. 

(Shortform note: While rational thinking may play a role in some people’s success, this doesn’t mean that all successful decisions are purely rational. For example, one expert on military strategy argues that leaders’ emotions can contribute to their success, as long as those emotions align with their objectives. He claims that former German chancellor and prime minister Otto von Bismarck was a successful military strategist because his fear compelled him to act quickly during battles. This fear didn’t impair Bismarck’s decision-making—rather, it supported his objectives to act offensively rather than defensively.)

Throughout his book, Bevelin draws on the wisdom of thinkers across many fields, but he particularly emphasizes the wisdom of two American investors: Warren Buffett and Charles Munger. He claims that these two billionaires are particularly disciplined in their commitment to rational thinking. Buffett, one of the world’s wealthiest people, serves as chairman and CEO of Berkshire Hathaway, one of the world’s largest public companies. Munger serves as the company’s vice chairman.

In the remaining sections, we’ll explore the two main origins of our major errors: 1) an evolutionary mismatch, and 2) our failure to understand and apply scientific concepts to our decisions.

Origin 1: Evolutionary Mismatch  

According to Bevelin, the first reason why we make irrational errors is that our brains evolved to support our survival as hunter-gatherers—not our survival in modern life. There’s a mismatch between how our brains are wired to problem-solve and the types of problems we face today. 

(Shortform note: Over the past several decades, popular science authors have used the idea of evolutionary mismatch to convince people to change their behavior and lifestyle. These authors typically fall into one of two camps. There are those who, like Bevelin, claim that our minds are the problem. These authors contend that our hunter-gatherer brains are outdated, and we need to improve our thinking strategies to better meet the demands of modern life. By contrast, some authors don’t think our minds are flawed—they think our modern environment is. Writers from this camp advocate bringing aspects of our ancestral lifestyle back into our lives, such as eating a Paleo diet (a diet similar to that of our hunter-gatherer ancestors).)

We’ll begin this section by exploring the past: We’ll examine the behaviors we developed as hunter-gatherers. Next, we’ll turn to the present to explain why these same behaviors lead us to make irrational errors today. Finally, we’ll further examine these irrational errors and explore rational thinkers’ wisdom on how to avoid these errors in the future.

The Past: We Evolved to Survive a Hunter-Gatherer Lifestyle

According to Bevelin, humans have developed traits that support a hunter-gatherer lifestyle. Humans were hunter-gatherers for close to 99% of our evolutionary history. Our hunter-gatherer ancestors lived in groups, found food through foraging and hunting, fended off predators, and competed to reproduce. 

(Shortform note: In his book, Bevelin discusses the hunter-gatherer lifestyle as if it’s a relic of the past—but modern hunter-gatherer societies exist in many regions around the world. Their lifestyles differ somewhat from those of the ancestral hunter-gatherers Bevelin discusses in his book. For instance, some modern hunter-gatherers supplement their diets with cultivated food, such as the Pumé people of South America, who grow cassava. Additionally, some hunter-gatherers have incorporated technology into their daily lives: For example, the Inuit living in the Arctic use snowmobiles to travel among their hunting grounds.)

Furthermore, Bevelin emphasizes that our hunter-gatherer brains evolved to avoid pain and find pleasure—and all human brains today still have this wiring. According to Charles Darwin’s theory of natural selection, hunter-gatherers with biological traits that allowed them to avoid pain (such as attacks from predators) and find pleasure (such as food and sex) were more likely to survive and pass these traits on to their children.

(Shortform note: Recent neuroscience research supports Bevelin’s claims and Darwin’s theory that we’re driven to seek pleasure and avoid pain. One study found that two specific kinds of neurons play a role in these two behaviors. The “GABAergic” neurons associated with pleasure-seeking motivate you to continue pursuing experiences that have given you pleasure in the past. The “glutamatergic” neurons associated with pain avoidance make it more likely you’ll avoid experiences that have caused you pain.)

The Present: Modern Life Demands Different Forms of Survival

According to Bevelin, we make major errors because the situations and threats we face today are vastly different from those we evolved to survive. Hunter-gatherers mainly faced threats to their physical safety, such as hungry predators and extreme cold. By contrast, today, we primarily face threats to our social and psychological safety, such as the threat of losing a job.  

Now, we’ll consider three of the most common, consequential irrational errors we tend to make due to this evolutionary mismatch. After describing these errors, we’ll locate their origins in evolutionary history. Then, we’ll share wisdom for avoiding these errors in the future.

Irrational Error 1: Jumping to Conclusions

According to Bevelin, we tend to jump to conclusions about people and situations before we have all of the facts about them. We jump to conclusions because we have what psychologists call an association bias: We preemptively judge people and situations as “good” or “bad” based on quick mental associations we make during a first impression.

Example 1: You Make an Unwise Purchase

Imagine you’re shopping for your next car. When you see one model’s high price tag, you jump to the conclusion that it must be the best option. This is because throughout your life, you’ve associated high prices with top quality. Due to this association bias, you skip researching the car’s potential downsides before purchasing it. Several months into owning the car, you discover that its engine is prone to overheating and its hardtop often fails to retract.

Example 2: You Stereotype Someone

Imagine you’re hiring for a new position at your workplace. You’re struck by how many tattoos one of the candidates has. You jump to the conclusion that they must be rebellious because throughout your life, you’ve associated tattoos with troublemakers. Because you jump to this conclusion, you fail to notice many of the candidate’s job-related and interpersonal strengths.

(Shortform note: Psychology research reveals that we may not be aware of some of our association biases. In Biased, psychologist Jennifer Eberhardt explains that the culture we grow up in shapes our implicit bias, or the unconscious bias we form when we associate a group of people with “good” or “bad” traits. Eberhardt elaborates that implicit anti-Black bias is a pervasive problem in the US. For example, there’s a harmful stereotype that Black girls aren’t good at math. Imagine you’re a math teacher who has formed an implicit association bias between “Black girls” and “poor math performance.” If you fail to recognize and correct this bias, you might underestimate your students’ abilities and hinder their learning.)

The Evolutionary Origin of Our Tendency to Jump to Conclusions

Although this habit of jumping to conclusions often leads us to act against our best interests, we have this tendency for a reason. According to Bevelin, early humans who formed quick associations were more likely to find food—and avoid becoming food. For instance, they’d associate the rustle of leaves with the image of a lurking predator. Any time they heard rustling leaves, they wouldn’t wait to confirm the source of the disturbance. Instead, they’d jump to the conclusion that a predator was lurking nearby, and they’d quickly hide or reach for their weapon.

A Solution: Question Your First Impressions Using Backward Thinking

Bevelin argues that, fortunately, you can counteract your tendency to jump to conclusions by questioning your first impressions. Doing so prevents these first impressions—which are often incomplete or wrong—from guiding your conclusions and decisions. 

According to Bevelin, Charles Munger offers a specific strategy for questioning your first impressions: a rational thinking technique called “backward thinking.” This strategy prompts you to search for information that discredits your first impressions. It pushes you to base your conclusions on factual evidence, rather than your biased first impressions.

Let’s apply Munger’s strategy to the earlier example of the tattooed job candidate. To engage in backward thinking, look for evidence that discredits your assumption that the tattooed candidate is rebellious. For instance, inspect the candidate’s resume for evidence that they’re responsible, and ask their references to describe the candidate’s personality traits. 

(Shortform note: Daniel Kahneman’s psychology research illuminates what happens in your brain when you question your first impressions using a strategy like backward thinking. In Thinking, Fast and Slow, Kahneman explains that our mind has two systems: one (“System 1”) that automatically reacts to stimuli, and another (“System 2”) that slowly deliberates. According to Kahneman, we can avoid making irrational mistakes if we have our System 2 question System 1’s biased assumptions. Backward thinking is a way to give your System 2 time to evaluate whether you should accept the thoughts and feelings your System 1 generates.)

Irrational Error 2: Emotional Decision-Making

Bevelin also argues that we often make irrational errors because we allow our emotions to influence our decision-making. Our emotions can lead us to make impulsive, unwise decisions. Here are three examples of this: 

  • You’re enthusiastic about a new job, and you take on too many commitments.
  • Your intense sexual desire for another person leads you to cheat on your spouse.
  • Your fear that a new model of phone could sell out leads you to impulsively buy it, even though you don’t need a new phone.

(Shortform note: Although Bevelin cautions against letting your emotions drive your choices, psychologist Daniel Goleman argues in Emotional Intelligence that your emotions can help you make rational decisions by making you more aware of your preferences. He explains that knowing your emotional preferences helps you choose among competing priorities. If you ignore or downplay your emotions, you might become paralyzed by indecision or make a decision you later regret. For example, imagine you earned a bonus and you’re deciding whether to spend it on a vacation or save it for a future purchase. There are rational reasons for both options, so you must look to your emotional preferences to make a choice.)

The Evolutionary Origins of Our Emotional Decision-Making

While our habit of emotional decision-making often leads us to make hasty or unwise decisions, this habit once ensured our survival. According to Bevelin, our ancestors were rewarded for engaging in emotional decision-making. Early humans whose associations took the form of strong, quick emotional reactions, rather than slower rational thoughts, were more likely to survive danger. For instance, a hunter-gatherer who froze in fear when they noticed movement in the trees was less likely to draw the predator’s attention than another hunter-gatherer who thought through the situation logically before choosing to freeze. 

(Shortform note: In Brain Rules, biologist John Medina emphasizes the role that stress played in our ancestors’ responses to danger, and he clarifies why that stress reaction no longer serves us today. Medina claims that we’ve evolved to effectively manage acute stress (short-term stress that helps us respond to urgent threats). This is the type of stress our ancestors dealt with using quick emotional reactions. However, we haven’t evolved to handle chronic stress (long-term stress), a type of stress we experience often nowadays. Medina explains that chronic stress floods our brains with cortisol, a hormone that in high levels can cause memory loss and impair learning—two effects that may compromise our ability to make thoughtful decisions.)

Although we’re conditioned to engage in emotional decision-making, we can use rational thinking to prevent our emotions from compromising all of our decisions. Let’s explore two ways to do so.

A Solution: Keep Calm and Consult Pre-Established Guidelines

First, Bevelin claims that you should avoid making decisions under the influence of strong emotions. We make the best decisions when we take the time to rationally weigh our options. Any time you’re experiencing a strong emotion, hold off on decision-making. Focus on calming down first.

Once you’re calm, Bevelin argues that you should consult pre-established guidelines: a list of rules or steps that instruct you on how to act. Pre-established guidelines counteract the influence of your emotional state by reminding you what actions to take or avoid. Make these guidelines in advance so that you can reference them any time you’re faced with a decision. Bevelin bases his ideas on pre-established guidelines on the wisdom of Charles Munger and Warren Buffett, both of whom use this rational thinking strategy.

For example, imagine you’re someone who often splurges on expensive purchases and regrets it later. Here’s a set of guidelines you could consult before making any spending decision:

  • Don’t buy extra items just to reach the free shipping threshold for online purchases.
  • Don’t buy a duplicate of an item you already own if the one you own serves its purpose.
  • When you have several options for an important item to buy, opt for the second-least-expensive option.
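As a rough illustration of how mechanical such guidelines can be, here's a sketch we've added (it's not from Bevelin's book, and every rule name and threshold is hypothetical) that encodes the list above as a checklist a purchase must pass:

```python
# A hypothetical pre-established spending checklist (illustrative only).
# Each rule adds a reason to veto the purchase; an empty list means "go ahead."

def check_purchase(price, option_prices, padding_for_free_shipping, owns_working_duplicate):
    vetoes = []
    if padding_for_free_shipping:
        vetoes.append("Don't pad the cart just to reach the free-shipping threshold.")
    if owns_working_duplicate:
        vetoes.append("You already own one that serves its purpose.")
    if len(option_prices) >= 2 and price > sorted(option_prices)[1]:
        vetoes.append("For important items, pick the second-least-expensive option.")
    return vetoes

vetoes = check_purchase(
    price=249.99,
    option_prices=[129.99, 169.99, 249.99],  # competing options for the same item
    padding_for_free_shipping=False,
    owns_working_duplicate=False,
)
print(vetoes or "All guidelines pass.")
```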

Irrational Error 3: Failing to Make the Moral Choice

Finally, Bevelin argues that we often make immoral choices because we irrationally base our decisions on what other people are doing rather than on what we think is right. One reason behind this failure is that we’re afraid having an unpopular opinion will invite criticism. For example, imagine one of your colleagues makes a sexist comment during a meeting. You avoid speaking up because you don’t want others to criticize you for being overly sensitive or disloyal.

(Shortform note: There’s a term for our tendency to conform to others’ expectations of us and make immoral choices: the banality of evil. It’s the idea that ordinary people can do extremely immoral things when there’s pressure to conform or to follow the orders of authority figures. Philosopher Hannah Arendt proposed this idea to explain why ordinary Germans became complicit in the atrocities of the Holocaust.)

The Evolutionary Origins of Our Failure to Make Moral Choices

Bevelin argues that throughout our evolutionary history, our habit of following others’ behavior helped ensure our survival. Cooperating with other people increased our chances of avoiding pain and seeking pleasure. Our hunter-gatherer ancestors hunted in groups, collaborated on building shelters, and treated each other’s wounds. These ancestors learned to strive for social acceptance since inclusion in the group had many benefits. They learned to fear social exclusion, as it would reduce their access to the benefits of group cooperation. 

(Shortform note: While Bevelin emphasizes how following and cooperating with others increased our chances of survival, the opposite behavior—distrusting others—also has evolutionary value. In Talking to Strangers, Malcolm Gladwell notes that distrusting others’ intentions helps us notice when others are trying to take advantage of us. However, he adds that we nonetheless evolved the tendency to trust others by default because mutual trust and transparent communication aided our survival significantly more than constant skepticism did.)

A Solution: Recognize the Benefits of Making the Moral Choice

Although our evolutionary past hard-wired us to prioritize social inclusion over independent moral thinking, we can counteract this tendency with rational thinking. Bevelin offers a strategy from Warren Buffett that can motivate you to make the moral choice—even if it’s the unpopular choice. When you’re faced with a moral choice, ask yourself if you’d be able to live with a critical journalist writing a front-page story about your decision. For instance, consider if you’d be able to live with everyone you know reading this headline: “Worker Fails to Call Out Colleague for Sexist Comment, Claiming She ‘Hoped Someone Else Would Do It.’”

Origin 2: Failure to Understand and Apply Scientific Concepts

According to Bevelin, the second reason why we make irrational errors is that we fail to understand scientific concepts and apply those concepts to our decisions. In this section, we’ll share wisdom from math and physics that can help you think more rationally. We’ve organized these ideas into two themes: concepts that relate to interpreting evidence, and concepts that relate to systems thinking. 

(Shortform note: Although Bevelin describes our failure to understand science as an origin of our errors that’s separate from our evolutionary mismatch, these two origins may be linked. In The Art of Thinking Clearly, Rolf Dobelli argues that our hunter-gatherer minds didn’t evolve to do complex math (which serves as the foundation for many scientific concepts as well). He explains that our ancestors mostly did linear math problems, such as simple addition and multiplication problems. By contrast, modern life requires us to engage in more complex, exponential math, such as understanding interest rates and making sense of statistics such as “crime in the city is rising at a rate of 5% per year.”)
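Dobelli's point is easy to verify with a few lines of arithmetic. In this small sketch we've added (the numbers are invented), a quantity rising "5% per year" ends up far above what linear intuition predicts:

```python
# Compound (exponential) growth vs. the linear intuition our minds default to.
rate, years, start = 0.05, 20, 1_000  # e.g., 1,000 incidents rising 5% per year

linear_guess = start * (1 + rate * years)   # what linear intuition expects
compounded = start * (1 + rate) ** years    # what compounding actually produces

print(f"Linear intuition after {years} years: {linear_guess:,.0f}")   # 2,000
print(f"Compounded reality after {years} years: {compounded:,.0f}")   # ~2,653
```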

Concepts for Interpreting Evidence

Bevelin argues that we often make irrational errors because we base our choices on the wrong evidence. Below, we’ll share two concepts that can help us rationally select evidence, interpret it, and use it to inform our decisions.

Concept 1: Correlation Vs. Causation

First, Bevelin claims that we sometimes make unwise decisions because we assume that evidence shows causation when it really shows only a correlation. When evidence reveals causation, it demonstrates that one variable causes another. When you base your decisions on causal evidence, you can be confident that your decision will likely produce the outcome you intend. By contrast, a correlation is evidence that two variables are related. However, just because two variables are related doesn’t mean that one causes the other. When you base your decisions on correlational evidence, you may make the wrong choice or waste your efforts.

For example, imagine you have a business that sells hand-knit sweaters. During your company’s first summer, you notice your sales are low. In response, you decide to invest extra time and money in marketing. When your sales surge from October to January, you attribute this increase to your recent marketing efforts. You then make this plan: Any time you notice a dip in sales, you’ll increase your advertising. 

However, this plan may be a waste of your time and energy: It’s possible that your advertising efforts and your increase in sales may only be correlated. Perhaps your surge in sales from October to January was due to the fact that people typically buy more sweaters during colder months, not due to your advertising efforts.
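A toy simulation we've added (not from Bevelin's book; all numbers are invented) makes the trap visible: If cold weather drives both your autumn ad spending and your sweater sales, the two will correlate strongly even though the ads cause nothing:

```python
# Toy simulation: a hidden confounder (cold weather) drives both ad spend
# and sales, producing a strong correlation with no causal link between them.
import random
import statistics

random.seed(0)
months = 36
coldness = [random.uniform(0, 1) for _ in range(months)]         # hidden driver
ad_spend = [10 + 20 * c + random.gauss(0, 2) for c in coldness]  # you advertise more in cold months
sales = [50 + 100 * c + random.gauss(0, 5) for c in coldness]    # cold weather sells sweaters

r = statistics.correlation(ad_spend, sales)  # requires Python 3.10+
print(f"Correlation between ad spend and sales: {r:.2f}")  # strong, despite zero causal effect
```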

A Solution: Base Your Decisions on Evidence That Shows Causation

To base your decisions on evidence that reveals causation instead of correlations, Bevelin recommends that you look for strong evidence that one variable causes another. For example, you could consult with other knitting entrepreneurs and compare data on when your sales rose and fell. Imagine that their sales also rose in the autumn, whether or not they ramped up their advertising efforts during that period. This comparison would reveal that there’s more evidence that the weather—not your marketing efforts—caused your autumn sales to increase.

Concept 2: Case Evidence Vs. Representative Evidence

Second, Bevelin argues that we sometimes make poor decisions because we base our choices on case evidence. Case evidence is usually a single data point or anecdote that doesn’t necessarily represent the actual probability of an outcome. This type of evidence can be misleading because it gives you the impression that an outcome is more or less likely than it actually is. For example, imagine that you’re considering whether to invest in bitcoin. You hear from a colleague that her bitcoin investment is making significant returns. Her situation is an example of case evidence, and it may not reflect the actual probability of profiting from a bitcoin investment.

A Solution: Base Your Decisions on Representative Evidence

To avoid basing your decisions on case evidence, Bevelin claims that you should base your decisions on representative evidence: evidence that reflects a large sample of cases. This evidence is more likely to reflect the actual probability of an outcome. For example, you could search for data revealing whether or not new bitcoin investors are making money. A recent study reveals that three in four new bitcoin investors have lost money on their initial investments. Because this research reflects the experiences of a large, representative sample of bitcoin investors, it’s more reliable than a single colleague’s testimony.
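A short sketch we've added (the loss rate comes from the study above; everything else is invented) shows why a sample of one misleads: A single investor drawn from a population where most lose money can easily be a winner, while a large sample recovers the true proportion:

```python
# One anecdote vs. a representative sample.
import random

random.seed(1)
TRUE_LOSS_RATE = 0.75  # "three in four new investors lose money"

def investor_lost_money() -> bool:
    return random.random() < TRUE_LOSS_RATE

anecdote = investor_lost_money()                         # your colleague: a sample of one
sample = [investor_lost_money() for _ in range(10_000)]  # representative evidence

print(f"Colleague lost money: {anecdote}")  # a one-in-four chance this is False
print(f"Loss rate estimated from the large sample: {sum(sample) / len(sample):.2%}")  # ~75%
```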

(Shortform note: Marketers sometimes leverage our tendency to base our choices on case evidence, as opposed to stronger representative evidence that may work against these marketers’ interests. Experts on marketing claim that you can increase your sales if you share testimonials from current customers who enjoyed your product or service. These testimonials draw in new customers who feel compelled by others’ success stories, even if those testimonials aren’t representative of the average customer experience.)

Concepts From Systems Thinking

According to Bevelin, another factor that impairs our decision-making is our failure to engage in systems thinking. This causes us to make decisions that produce unintended consequences. He explains that every action exists within a larger system, so you should always consider the far-reaching consequences of your actions. He defines a system as a complex web of interdependent components, such as a business that has multiple employees, processes, and clients. Systems thinking is a type of analysis in which you consider how interdependent parts affect the whole.

Let’s explore two concepts from systems thinking that can help us avoid unintended consequences.

Concept 1: Limiting Factors Affect the Larger System

First, Bevelin emphasizes that most systems have a limiting factor that affects the performance of the entire system. A limiting factor is an element of a system that many other aspects of the system depend on. If the limiting factor fails or is slow, then the whole system will also fail or operate slowly. When you’re trying to improve or grow a system, identify its limiting factor, then improve it, thereby improving the system as a whole. If you fail to understand the limiting factors of the systems in your life, your efforts to improve those systems are doomed to fail.

Let’s explore Bevelin’s concept of limiting factors by thinking of a common system: your home and family. Imagine you recently had a child and you hope to continue growing your family. The city you live in is a limiting factor in your home system because if you have to keep paying such high rent, you may never be able to afford a larger home to accommodate a larger family. To change this limiting factor, move to a new city with lower rent so you can afford a larger living space.

Concept 2: Actions Produce Far-Reaching, Unintended Effects  

Furthermore, Bevelin argues that because you exist in complex systems, your actions can produce far-reaching consequences, including unintended consequences. If you fail to predict your actions’ consequences, one of your decisions may cause a ripple effect that produces negative outcomes elsewhere in the system.

(Shortform note: A term people sometimes use to describe the unpredictability of consequences is “the butterfly effect.” This term was coined by a meteorologist whose weather predictions led him to conclude that a small change in a system could produce unexpected, large outcomes elsewhere in the system. He reasoned that the flap of an additional butterfly’s wings in one region of the world could contribute enough wind current to create a tornado elsewhere.)
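You can watch this sensitivity play out in a few lines of code. The sketch below, which we've added, uses the logistic map, a standard textbook stand-in for the meteorologist's weather model (it's not the model he actually used): Two trajectories that start almost identically soon diverge completely:

```python
# Sensitive dependence on initial conditions, shown with the chaotic logistic map.
def logistic_trajectory(x: float, steps: int, r: float = 4.0) -> list[float]:
    values = []
    for _ in range(steps):
        x = r * x * (1 - x)
        values.append(x)
    return values

a = logistic_trajectory(0.200000, 50)
b = logistic_trajectory(0.200001, 50)  # a "butterfly flap" of difference

for step in (1, 10, 30, 50):
    print(f"step {step:2d}: {a[step - 1]:.6f} vs {b[step - 1]:.6f}")
# The runs track each other early on, then diverge unpredictably.
```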

Let’s illustrate Bevelin’s idea of unintended, far-reaching consequences using the example of the family as a system. Imagine you accept a promotion at your company so you can earn a higher income and better provide for your family. You become so busy in your new role that you spend less time at home. Your partner is left with more parenting responsibilities, and this imbalance creates tension in your relationship. Your children are upset by this tension, and they begin to act out at school.

To increase the likelihood that your actions within a system will lead to your intended outcomes, Bevelin claims that you should try to predict the far-reaching consequences of your actions. For example, before taking the promotion you’re offered, discuss with your partner how it may affect them. Then, you two could brainstorm ways to reduce the burden of parenting on your partner. For instance, you could ask other family members for support watching the kids or enroll your children in a fun after-school program.

(Shortform note: In Thinking in Systems, Donella Meadows offers a visual technique for better understanding systems, which could help improve your ability to predict your actions’ far-reaching consequences. She recommends that you draw a diagram of your system by mapping out each of its parts and how they’re connected. Doing so forces you to notice how your systems’ elements interrelate, helping you notice how one change might trigger another change elsewhere in the system. For example, if you’re a manager who’s restructuring your department, create an organizational chart that visualizes the hierarchies within your department and the relationships among your employees.) 

Bevelin offers the caveat that it’s impossible to closely examine the entire system surrounding your actions, since systems are extremely complex. Rather than spending your time trying to predict every effect your actions will have (which is impossible), expect that your actions will have unintended, negative consequences, and plan for those outcomes.

Bevelin shares a strategy from Warren Buffett on how to plan for unexpected, negative outcomes: Build safety factors into your predictions. A safety factor is a buffer you add to a prediction; if your prediction turns out to be too optimistic, the safety factor helps you avoid disaster. Buffett says his company won’t buy a stock when it estimates the stock’s value to be only slightly more than its price. Instead, the company builds a safety factor into its purchasing decisions by buying a stock only if its estimated value is significantly higher than its price.
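Value investors call this buffer a "margin of safety." Here's a minimal sketch we've added (the 30% threshold is our assumption, not Buffett's actual number) of building a safety factor into a buy decision:

```python
# A safety factor applied to a buy decision (threshold is illustrative).
def should_buy(estimated_value: float, price: float, safety_factor: float = 0.30) -> bool:
    """Buy only if estimated value exceeds price by at least the safety factor."""
    return estimated_value >= price * (1 + safety_factor)

print(should_buy(estimated_value=105, price=100))  # False: value only slightly above price
print(should_buy(estimated_value=150, price=100))  # True: a large buffer against optimism
```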

