
Why do so many people fail to solve problems effectively? How can you use data to solve your problems?

Think Like a Freak by Steven Levitt and Stephen Dubner offers a revolutionary approach to problem-solving. The authors show you how to tackle everything from personal dilemmas to global issues by adopting a data-driven mindset that questions assumptions and embraces unconventional thinking.

Keep reading for an overview of the book Think Like a Freak.

Overview of Think Like a Freak

In Think Like a Freak, economist Steven Levitt and journalist Stephen Dubner set out to help readers solve their problems by approaching them in unconventional—yet rational and data-driven—ways. They argue that when you properly analyze facts instead of following emotions or morals, you’ll come up with far more effective (if unexpected) solutions. This is true whether your problems are lighthearted ones like improving your competitive-eating skills or weighty ones like reducing famine.

The authors build on the success of their 2005 book, Freakonomics, which analyzed everyday problems through an economic, data-driven lens to question conventional wisdom. Freakonomics has sold over 4 million copies and inspired multiple award-winning podcasts, a blog, a documentary, and several “freakquels,” including SuperFreakonomics and When to Rob a Bank. In Think Like a Freak, they re-examine some of their earlier insights, such as the power of incentives and the importance of questioning common knowledge, and add practical tips for applying them to your life. 

In this guide, we’ve grouped their main points by theme and will examine their advice to:

  • Admit what you don’t know.
  • Identify the root problem.
  • Think like a kid.
  • Use incentives smartly.
  • Know when to quit.

Along the way, we examine some of the psychological roots of their ideas, connecting their thoughts to advice from other psychologists who also look at problems in unconventional ways, such as Angela Duckworth, Nassim Nicholas Taleb, and Rolf Dobelli.

Admit What You Don’t Know

Levitt and Dubner write that one reason people fail to solve problems effectively is that they don’t admit when they don’t know something. There are a number of reasons for this: People confuse beliefs with facts, overestimate their skills, and suffer few consequences for saying things that are wrong. We’ll review each of these points and then explain Levitt and Dubner’s advice for overcoming ignorance by running experiments.

We Confuse Facts With Beliefs 

Levitt and Dubner define facts as things that can be scientifically proven—for example, what time the sun rises. Beliefs are things we think (or feel) are true but can’t be scientifically verified—like the existence of angels. They write that we often confuse the two, thinking that our beliefs represent reality in the same way facts do. However, beliefs and reality can be grossly misaligned. 

One reason this happens is that our beliefs can be influenced by the views of others—such as politicians, religious leaders, or businesses that profit by influencing our opinions. Another reason our beliefs can be out of step with reality is that they’re shaped by our moral compass—our unshakable sense of right and wrong. Our moral compass tells us there’s only one right answer to any problem, that the answer is clear and obvious, and that we already know everything we need to know about it.

When we mistakenly think our beliefs are facts, we tend to justify them by only paying attention to evidence that confirms them. Thus, rather than rationally evaluating data to discover if what it indicates is true, we filter it, ignoring what we don’t like and interpreting the rest so that it supports our position. This is why two intelligent people can examine the same data but come to completely different conclusions about, for example, the effectiveness of gun control—each person interprets the rise or fall of crime and gun ownership differently. 

We Overrate Our Skills

Levitt and Dubner write that another reason people don’t admit what they don’t know is that they overestimate their skills, both general and specialized.  

First, people assess their general knowledge and skills at everyday tasks more highly than is warranted: One study found that 80% of respondents rate themselves as better-than-average drivers—a statistical impossibility. 

Second, people rate themselves more highly than they should in areas that require specialized expertise. Experts in various fields are especially prone to this: When someone has specialized knowledge in one area, they tend to be overly confident that they also have expertise in other, unrelated areas. For example, someone with an advanced degree in mathematics may offer strong opinions on political matters, as if their mathematical expertise carried over into that field.

Consequences for Guessing Wrong Are Small

Levitt and Dubner also argue that people rarely admit when they don’t know something because the consequences for wrongly assessing or predicting something are usually smaller than the costs of saying “I don’t know.” If you admit you don’t know something, you risk other people looking down on you. On the other hand, if you confidently state a fact, you burnish your reputation by posing as informed—whether or not you actually are. Then, if the thing you said turns out to be untrue, nothing bad typically happens. 

So many incorrect facts and predictions get thrown around that most people don’t keep track of who got what wrong, and anything you say that turns out to be incorrect quickly gets forgotten. This is why so many financial experts confidently predict the future movements of the stock market and why so few of them suffer backlash when their predictions are wrong.

How to Experiment Effectively

Levitt and Dubner argue that the only way to truly know something is to run experiments and see what works. Experimentation gives us real-time feedback on our ideas, allowing us to know if something that works in theory also works in practice. They note that this is how humans have learned everything throughout history, from what foods are poisonous to how to build houses: We can come up with a design, but we can’t truly know if it will stand firm until we see it in action. 

They therefore advise that when asked about something you’re unsure of, the best thing to say is, “I don’t know,” and then follow up with, “But I can try to find out.” There are many benefits to this approach: The most obvious is that you’re far more likely to find the right answer to the problem. Another is that you’ll boost your credibility—people will respect your honesty, and the next time you come up with an answer, they’ll be more ready to believe you. 

Levitt and Dubner write that there are three kinds of experiments: lab experiments, field experiments, and natural experiments. 

Lab experiments are those in which you recruit volunteers to participate in a structured, controlled test. These can reveal valuable information, but they also have limits: Studies often don’t imitate real-world circumstances well enough to accurately predict what would happen if people encountered the same conditions outside the lab. Additionally, people who know they’re being observed behave differently than they would in private.

Field experiments are when you change some factor in the real world and then watch how people react. These can reveal more accurate information about human behavior than lab experiments, but they also have limitations. In particular, it’s not always possible—or ethical—to ask people to change their behavior in certain ways. For example, you couldn’t ask parents to feed their children something harmful to see what would happen. 

Natural experiments happen when a naturally occurring event changes how people operate. These allow you to see and analyze how such a change prompts people to behave differently. For example: If you’re trying to identify the effects of a certain policy, you can examine different times and places where such a change occurred to see how it might have affected lives—like whether car accidents became more or less frequent after speed bumps were added to certain roads. 

Identify the Root Problem

Levitt and Dubner discuss another key reason people have trouble coming up with effective solutions: They identify the wrong problem, which they then set out to solve instead of solving the true, core problem.

This is partly because, for survival reasons, we’ve evolved to quickly identify the most obvious problem. For example, when there’s a tiger hiding in a bush, our lives depend on quickly identifying that danger, not carefully considering it. However, more complex questions have more complex, less obvious answers. Because we’re hard-wired to look for the obvious problem, we mistakenly address what seems like the problem rather than the actual problem. This can lead us to treat the symptoms of the wrong issue entirely.

Levitt and Dubner write that to find the true root cause of a problem, rather than what appears to be the cause, you must redefine the problem. This means that when most people are asking the same questions or blaming the usual causes, you should try to view the question from a different perspective and look for different causes.

The authors cite, as an example, the success of Takeru Kobayashi, the Japanese competitive eater who broke world records in hot-dog-eating competitions by rethinking how to approach the challenge. Before him, competitors simply tried to chew faster and gulp larger portions. But instead of simply aiming to eat more hot dogs, Kobayashi set out to make hot dogs easier to eat. Noting that there was no rule mandating how the hot dogs must be eaten, he took them out of their buns, ate the hot dogs alone, and ate the buns after dipping them in water. By doing so, he redefined the competition as a true sport for which people can train and strategize.

Levitt and Dubner note that root causes can be difficult to spot, often because they’re the results of long-running societal wrongs. For example, famine is more than a lack of food—it’s a symptom of malfunctioning political, social, and legal establishments that lead to an unhealthy economy plagued by inequality and poverty. This is a deeper, broader dysfunction that can’t be solved by simply supplying food to affected people, which is a temporary solution addressing the symptom, not the underlying cause. 

Think Like a Kid

Levitt and Dubner advise that to effectively see problems from a new viewpoint, or to generate ideas and questions, it helps to have the mentality of a child. They argue that children inherently see the world from a new angle—not just metaphorically but also literally, since they’re generally smaller than adults—and this allows them to spot things adults overlook. 

Kids are also naturally curious, and they haven’t yet developed preconceptions and biases that stop them from seeking to learn—such as overconfidence in their own expertise. Additionally, they don’t know the “common knowledge” related to any particular problem, and they don’t know what others have tried before. This allows them to approach problems with a fresh mindset, without the self-imposed restrictions adults often work under. 

Levitt and Dubner offer a few tips on how to approach problems with the mentality of a child:

Ask About the Obvious

As adults, we typically don’t question things we learned long ago, such as set-up steps in a process or background pieces of information. We take them for granted and assume they’re true because we’re used to them. But someone learning something new will often question those things and, in doing so, may reveal insights we’ve overlooked. Levitt and Dubner encourage you to emulate this—question things you haven’t thought about critically in a while.

Generate Lots of Ideas and Edit Them Later

As you brainstorm, don’t worry about what’s good or bad. Sit on your ideas for a day and see how you feel about them—this will usually reveal which ideas are better than others. 

Have Fun

There are several ways that having fun can lead to better solutions. First, when you’re having fun, you’re more likely to examine things more closely—people who ask unusual questions are often the ones who are enjoying themselves. Second, you’re more likely to spend time on something if you’re enjoying it, and research shows that the key to success is not raw talent but practice—how many hours you put into a pursuit. Third, by using fun, you can encourage other people to behave in certain ways. For example, a teacher might encourage their students to exercise by offering prizes to the class that does the most jumping jacks. 

Use Incentives

Levitt and Dubner write that to solve a problem that involves other people—be it a personal, professional, or societal problem—you must understand how and why people respond to incentives. Incentives are at the root of all human behavior and drive everyone’s decision-making. 

Levitt and Dubner note that it can be difficult to identify a person’s true incentives. People don’t always clearly say—or admit—what really drives them. Instead, they often say what they think the other person wants to hear, but later, in private, they behave as they truly want to. Economists call these declared preferences and revealed preferences, respectively. The key to crafting successful solutions is figuring out how to bridge the gap between these two types of preferences so that your incentives appeal to what people will actually do rather than what they think they should do. If you don’t, the solutions you come up with may not inspire people to act as you hope they will, and thus may be ineffective.

Next, we’ll review Levitt and Dubner’s discussions of the power of herd mentality, the fact that people generally make selfish decisions, how incentives encourage people to reveal their true selves, and how using incentives can backfire. 

The Herd Mentality

Levitt and Dubner note that one of the most powerful incentives is the desire to fit into the group, often referred to as the herd mentality. They write that this desire can rival the desire for money, and they cite a study that demonstrates just how strong the herd mentality can be. The study also highlights the discrepancy between declared and revealed preferences.

Through a phone survey, researchers asked California residents what would motivate them to cut back on their energy use. Most people cited environmental concerns, followed by societal benefits and then financial concerns. They ranked “other people also conserving energy” last.

However, the next phase of the experiment showed that what people actually cared about was very different from what they claimed. The researchers hung information tags on people’s doorknobs encouraging them to save energy. Each tag cited a different reason along with a related statistic: for example, that residents could reduce pollution by a certain amount every month, that they could save a certain amount of money, or that 77% of their neighbors used fans instead of air conditioning when possible.

Then, the researchers measured the energy use of each home. They discovered that the homes receiving tags that said their neighbors were also conserving energy lowered their energy usage significantly more than any other homes. Levitt and Dubner write that when you successfully identify people’s true incentives like this, you can spur them to do something right even if it’s for the wrong reasons—like getting them to save energy just to fit in with the neighbors. 

People Make Selfish Choices

Levitt and Dubner write that when you evaluate incentives and base solutions on them, you must keep in mind that people usually act selfishly, putting their personal interests ahead of the concerns of a larger group. This human tendency poses challenges for policymakers—organizations tasked with improving society have to figure out how to counter the human instinct to work against the greater good in favor of personal benefit. 

To illustrate this instinct, Levitt and Dubner examine the thought process that soccer players go through during a penalty kick—when a kicker has a chance to shoot the ball into the goal with only the goalie to block it. The kicker can choose to kick left, right, or center, and will do so as randomly as possible to prevent the goalie from detecting a pattern. The goalie must choose to jump left, jump right, or stay in the center before the kick happens, because it takes just a fraction of a second for the ball to travel from the kicker to the goal. 

Rationally, you’d expect the kicker to choose left, right, or center approximately one-third of the time each, and the same for the goalie. However, data shows that kickers kick to the center only 17% of the time—and goalies stay in the center only 2% of the time. Shots aimed at the center of the goal are significantly more likely to succeed, and both kickers and goalies know this. So why do kickers so seldom shoot there, and why do goalies so infrequently defend it?

Levitt and Dubner attribute this tendency to the desire on both sides to avoid embarrassment. A kicker who aims for the center risks looking foolish if the goalie simply stays put and stops the ball. And if the goalie stays in the center while the ball is kicked to either side, it will look like they didn’t bother to try. Either way, the crowd will react negatively.

This example illustrates how incentives can drive selfish decisions—in this case, the players are motivated by the personal incentive not to look silly rather than the teamwide incentive to win the game. Levitt and Dubner argue that acting selfishly in this way doesn’t make anyone a bad person—it simply means they’re human. When you’re constructing incentive systems, you must assume that people are selfish and design your incentives accordingly.

Self-Selecting Incentives 

Levitt and Dubner write that one way you can leverage incentives to benefit your project or organization is by designing incentive schemes that encourage people to reveal their true inner motivations, even if those motivations are something they’d rather keep hidden. If you set up incentives to attract certain people in certain ways, you can get them to self-select into categories, allowing you to know what kind of people they are and how to interact with them. 

For example, if you’re hiring an employee, you can design your job application process to discourage the wrong people from applying. This is the reason some companies make their application process difficult even for entry-level jobs: It prevents people from applying who might quickly quit if they do get the job. A difficult application process is self-selecting, since it only attracts determined, diligent applicants. 

Be Careful of Incentives Backfiring

Levitt and Dubner warn that incentives sometimes have the opposite effect of what you intend: They can encourage more of the bad behavior you’re trying to prevent. One manifestation of this is if you warn people that a problem is common—instead of feeling moral outrage, they’ll often think, if everyone else is doing this, I can too. This is an undesirable consequence of the herd mentality, and it was illustrated when park rangers at Arizona’s Petrified Forest National Park posted a sign stating that 14 tons of petrified wood were stolen every year. Their intention was to deter visitors from stealing more, but instead, visitors developed a fear of missing out, prompting many to grab a piece of the forest before it all disappeared.

Levitt and Dubner say incentives can also backfire if you offer to pay people to destroy something undesirable—they’ll often create more of it to destroy so they can get paid. This tendency is sometimes called the “cobra effect,” named after an incident in colonial India when a British administrator, concerned about the burgeoning cobra population, offered a cash reward for cobra skins. Instead of leading to fewer cobras, the bounty gave rise to a cobra-farming industry built around raising and slaughtering the snakes.

A more recent example of this phenomenon occurred when the United Nations offered payments to manufacturers for destroying their pollutants. Some manufacturers began producing more of those pollutants simply so they could destroy them and collect the payments, leading to a sharp increase in pollution.

It can be difficult to predict unintended consequences, write Levitt and Dubner, because people are often smarter than the organizations behind incentive schemes and can be highly motivated to come up with ways to game them. In addition, people can often tell when organizations are trying to manipulate them, and they tend to rebel against that. 

Know When to Quit

Levitt and Dubner wrap up their arguments by suggesting another unconventional mindset that can keep you from chasing ineffective solutions: Be willing to quit. They write that unrelenting perseverance is often promoted as a positive trait, but it can sometimes be harmful—although persistence and tenacity are key elements of success, you can waste time, energy, and money if you get caught up in pursuing a goal that’s ultimately unattainable. Thus, it’s important to recognize when it’s best to cut your losses and then to be willing to change your plans. Studies show that people who give up on unattainable goals feel better psychologically, and their physical health improves as well.

Levitt and Dubner cite three reasons people resist quitting even when they should:

  • They believe quitting means failing.
  • They fall for the sunk-cost fallacy.
  • They ignore opportunity costs.

We’ll examine each of these reasons in more detail, and then look at Levitt and Dubner’s advice on how to lessen the likelihood that you’ll quit a project.

Quitting Is Not Failing

Quitting does technically mean failure in that you fail to reach the goal you set out for. However, Levitt and Dubner argue that failing to immediately meet your goal isn’t always a bad thing and shouldn’t be viewed as true failure. Sometimes failures along the way are necessary so that you or others can reach the ultimate goal, even if that means reaching it in a different way than you originally intended. For example, when scientists or medical researchers pursue a path that doesn’t produce results, they still make a contribution, because they let others know not to follow that path later. Quitting under these circumstances can be an important part of the process.

The Sunk-Cost Fallacy

The sunk-cost fallacy is the belief that once you’ve invested a significant amount of time, money, effort, brainpower, or other resources into something, you should keep pursuing it. People don’t like to abandon projects they’ve put many resources into because they feel it will be counterproductive. But, Levitt and Dubner argue, it’s better to cut your losses on projects that won’t pan out rather than continuing to sink resources into them. 

Opportunity Costs

Levitt and Dubner argue that people often overlook the fact that pursuing one goal means they can’t pursue others. These missed opportunities can be hard to quantify, but when you’re considering moving forward with a plan, it’s important to think about what else you could be doing with your time instead. For example, if you’ve taken on the renovation of a dilapidated house, think about what else you could do with the money and labor it will entail. You might get a better return from, say, going back to school for an advanced degree, making it more beneficial to quit the renovation project and redirect your resources elsewhere.

Levitt and Dubner don’t advise that you quit just to watch TV all day, but instead that you quit to take up a new goal. They advise that if you’re unhappy in your current path, don’t ignore that feeling and push through it; consider what else you might do that could lead to the same ultimate end. For example, if you’re unhappy with your chosen line of work, examine what it is about your work that you like and what you like doing outside your work. Then, try to find a direction that marries those two factors. This may bring you to your ultimate goal of having fulfilling employment, even if it means quitting your current method of achieving that goal. 

Lessen Your Chances of Quitting

Levitt and Dubner offer guidance on what you can do when approaching a project to lessen the chances that you’ll have to abandon it: Conduct a premortem. A premortem is the opposite of a postmortem, which is what organizations do to analyze what went wrong after a project has failed. With a premortem, you think through all the possible points of failure in advance so you can prepare for them.

To conduct this analysis, ask everyone involved with the project to imagine that it fails, then ask them to come up with reasons for its failure. This thought experiment is effective in flushing out potential problems that otherwise may go unmentioned—issues that people may have quietly been thinking about but didn’t see a reason or an opportunity to raise. Levitt and Dubner note that if you make the premortem anonymous, people will be more honest, and will more readily bring attention to problems they may otherwise hesitate to mention. 

Exercise: Identify a Root Problem

Levitt and Dubner write that effective solutions are those that address the true, core problem of a situation, but that often, people try to solve the wrong aspect of a problem—leading to wasted time and effort. In this exercise, try to see a particular difficulty you’ve experienced with fresh eyes. 

  • Describe an issue that comes up repeatedly in your life. It might be in your professional life (such as misunderstandings between you and your manager) or your personal life (such as your kids frequently being late to the school bus in the morning). 
  • What solutions have you been using to address this issue? For example, have you been sending multiple follow-up emails at work, or raising your voice to your kids as the time to leave approaches?
  • Identify the problem that your current solutions are aimed at solving. For example, sending multiple emails might suggest you feel your manager is forgetful, and yelling at your kids suggests you don’t believe they share your urgency about making the bus. 
  • Imagine that the problems you’ve been addressing aren’t at the heart of the issue—your manager remembers things just fine, and your kids are equally stressed about being late. Consider instead what other problem may lie at the root of the issue. For example, might your manager not fully understand your deadlines? Or, are your kids approaching their morning routine in the wrong order? 
  • Now, think of alternative solutions that would address the new problems you’ve identified. For example, how could you better convey your priorities to your boss, or how might you better structure your kids’ morning routines?

