
Why does false information spread so quickly online? What if you could protect your mind from manipulation before you even encounter it?

In Foolproof, Sander van der Linden offers a groundbreaking solution. The social psychologist’s research reveals how to build mental immunity against deception using techniques borrowed from medical science.

Read more for our overview of the book Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity.

Overview of Foolproof (Sander van der Linden)

In Foolproof, Sander van der Linden presents a new framework for understanding and combating misinformation. His central insight is that false information functions like a virus: It infects our thinking, exploits vulnerabilities in our cognitive “immune system,” and spreads rapidly from person to person through social networks. Van der Linden, a social psychologist, also argues that if misinformation behaves like a virus, we can develop psychological vaccines against it. Through a technique called “psychological inoculation,” he demonstrates how exposure to weakened forms of misinformation techniques can build mental immunity, helping us recognize and resist manipulation before we encounter it.

The threat of misinformation has grown as falsehoods increasingly shape public discourse, election outcomes, and even public health decisions. Drawing on his expertise as the director of the Cambridge Social Decision-Making Lab and years of psychological research, van der Linden offers a solution that goes beyond traditional fact-checking. Rather than trying to correct false beliefs after they’ve taken hold, he presents evidence-based strategies for preventing misinformation from infecting minds in the first place. 

Our overview of Foolproof explores van der Linden’s framework in three sections: First, we explain what misinformation is and why it’s so dangerous in today’s digital environment. Next, we examine why humans are cognitively vulnerable to misinformation and why traditional approaches to combating it often fail. Finally, we present van der Linden’s inoculation strategies and how they can build immunity at both individual and societal levels.

What Is Misinformation, and Why Is It Dangerous?

We’ve all encountered false information online—perhaps a misleading headline about climate change, a doctored image during an election, or a conspiracy theory about a public health measure. While it might seem like these examples are just annoying distractions, van der Linden argues that misinformation functions much like a biological virus: It infects minds, spreads between people, and can cause serious damage. Van der Linden asserts that this comparison can help us understand why fact-checking often fails and why we need preventative approaches. Just as public health experts use preventative strategies to combat disease outbreaks, we can stop false information from taking hold and doing harm.

Van der Linden distinguishes between two kinds of false information: “Misinformation” refers to any information that’s false or misleading, regardless of intent. This could include honest mistakes, misunderstandings, or information that contains some truth but is presented in a way that leads to incorrect conclusions. “Disinformation” is misinformation that people deliberately create and spread with the intent to deceive others. This includes propaganda, strategic deceptions, and coordinated manipulation campaigns. Van der Linden focuses primarily on misinformation because both types operate similarly once they begin to spread, and both can cause significant harm regardless of the original intent.

Why Misinformation Causes Real-World Harm

Van der Linden explains that misinformation harms society through several fundamental mechanisms that extend beyond simple factual errors.

Misinformation undermines informed decision-making. Van der Linden explains that when people act on false information, they might make choices that aren’t aligned with their actual interests or values. This applies whether someone is deciding how to vote, which medical treatment to pursue, or how to prepare for environmental changes: The damage occurs when misinformation prevents people from accurately understanding the consequences of their choices.

Misinformation erodes trust in essential institutions. When false information circulates widely, people become uncertain about what or whom to believe. This uncertainty can lead to a generalized skepticism, where we naturally distrust even legitimate sources of information. Van der Linden explains that this erosion of trust damages the functioning of critical social institutions, from healthcare systems to electoral processes.

Misinformation increases polarization and social division. False information often portrays complex issues in simplistic, divisive terms that frame disagreements as battles between good and evil rather than differing perspectives that might find common ground. Van der Linden explains that this framing makes compromise more difficult and renders collaborative problem-solving nearly impossible, regardless of which side of an issue someone supports.

Misinformation can incite harmful action or dangerous inaction. At its most extreme, misinformation can lead to violence, such as when false rumors trigger mob attacks on innocent people. But it can also prevent necessary action on pressing problems by creating doubt or confusion about the established facts, delaying responses to threats that affect everyone.

How Misinformation Acts Like a Virus

Van der Linden builds on research in epidemiology (the branch of medical science that studies how health and disease are distributed throughout a population) and memetics (the study of how ideas spread) to explain that misinformation spreads through social networks following patterns remarkably similar to disease outbreaks. This occurs in three stages: infection, transmission, and replication.

Infection occurs when false information latches onto our minds and alters our understanding. Once we’re “infected” with false beliefs, those ideas integrate themselves into our knowledge networks and influence our thinking. Transmission happens when people share misinformation through conversations, social media, and messaging apps. Digital platforms enable person-to-person spread without requiring physical contact, making misinformation even more contagious than biological viruses. Replication takes place when people slightly modify misinformation as they share it, making it more compelling or tailored to their audience, similar to how viruses mutate as they spread.

Why Are We Susceptible to Misinformation?

Understanding how misinformation spreads leads to a critical question: Why are we susceptible to these pathogens in the first place? You might consider yourself a rational person who wouldn’t fall for obvious falsehoods. Yet research shows that nearly everyone is vulnerable to misinformation under the right circumstances—regardless of education, intelligence, or political affiliation. Van der Linden explains that our susceptibility stems from fundamental aspects of how our brains process information.

Our minds evolved systems for making quick decisions in complex environments, but these same systems create vulnerabilities that misinformation can exploit. In this section, we’ll explore three key factors that make us susceptible to false information: the cognitive shortcuts we rely on daily, the powerful role emotions play in our information processing, and the social influences that shape what we believe. Understanding these vulnerabilities helps explain why traditional approaches to combating misinformation often fail to prevent or contain its spread.

Our Cognitive Shortcuts Trip Us Up

First, we rely on cognitive shortcuts to navigate the world’s complexity, but they leave us open to manipulation. For instance, the illusory truth effect causes us to believe information simply because it’s familiar. Van der Linden explains that each time we encounter a claim, our brain processes it more fluently, and we mistakenly interpret this ease of processing as a signal of truth. In experiments, people rate statements they’ve seen before as more likely to be true than new statements, even when they initially recognized those familiar statements as false.

Confirmation bias drives us to seek out and accept information that confirms our existing beliefs and scrutinize or reject evidence that contradicts those beliefs. During election controversies, people selectively consume and share news that supports their preferred narrative while dismissing opposing viewpoints.

Pattern recognition, which van der Linden notes is an essential skill for making sense of our world, can go awry when we face complex situations. He explains that our brains are designed to find meaning and connections, so we sometimes think we see patterns where none exist. For example, when major tragedies occur, conspiracy theories flourish precisely because they offer pattern-based explanations for events that seem too significant to be random.

Our Emotional Responses Shape Our Perception of Information

Beyond cognitive shortcuts, van der Linden identifies our emotional processing systems as a second major vulnerability to misinformation. Van der Linden explains that emotional content—especially material that triggers feelings of fear, anger, or moral outrage—captures our attention, bypasses our critical thinking skills, and is easier for us to remember and share over time. Analysis of social media reveals that false news stories spread significantly faster than true ones precisely because they typically contain more novel, surprising, and emotionally provocative content.

What researchers call identity-protective cognition leads us to reject information that threatens our social identity or worldview, regardless of its factual accuracy. Van der Linden explains that you can often see this at play during politically charged debates on issues like gun control or immigration, where people interpret identical statistics in completely different ways depending on their political affiliation and how they see the world.

Social Influences Keep Us From Evaluating Information Objectively

Third, our social nature creates additional vulnerabilities to misinformation. Trust networks shape what information we accept. Van der Linden explains that we’re more likely to believe and share information from people we trust—particularly friends, family, and respected community figures—so misinformation that enters our network through those people bypasses many of our skepticism defenses.

Social proof, where we look to others’ actions to shape our idea of what’s true or correct, also influences our judgment. When multiple people in our social circles believe something, we’re more likely to accept it without verification. Van der Linden points out that this “everyone is saying it” effect often proves particularly strong for claims that are difficult to verify through your own experience.

Misinformation Deliberately Manipulates Our Beliefs

Van der Linden identifies several key techniques that misinformation producers commonly use to manipulate beliefs. These techniques aren’t random—they’re carefully designed to exploit our cognitive and emotional vulnerabilities, which makes them particularly effective at spreading misinformation. 

Attacking credible sources undermines the legitimacy of information providers. Van der Linden explains that misinformation campaigns often begin by discrediting scientists, journalists, or fact-checkers rather than directly addressing the evidence behind a story. For example, climate change denial efforts frequently attack the integrity and motives of climate scientists, creating doubt about their findings without engaging with the actual data.

Exploiting emotions bypasses rational thinking. By using emotionally charged language and imagery, misinformation triggers strong feelings—particularly fear, anger, or moral outrage—that overwhelm critical analysis. Anti-immigration campaigns, for instance, frequently highlight isolated violent incidents to provoke fear, leading people to overlook broader statistical realities about immigration’s effects.

Creating false divisions forces complex issues into simplistic frames. According to van der Linden, misinformation often portrays nuanced topics as black-and-white conflicts between good and evil. This framing eliminates the middle ground and reduces the likelihood of finding a reasonable compromise. This polarization technique also transforms factual disagreements into identity-based conflicts, making people less willing to consider alternative viewpoints.

By impersonating authorities, misinformation producers borrow credibility from trusted sources. They create fake expert credentials, mimic legitimate organizations, or design websites that look like established news outlets. Van der Linden notes that the tobacco industry famously created research institutes with scientific-sounding names to cast doubt on the health risks of smoking, exploiting people’s tendency to trust apparent expertise.

Constructing conspiracy theories provides alternative explanations for complex events. Van der Linden explains that these narratives typically involve secret plots by powerful entities, appealing to our pattern-seeking minds and our desire for simple explanations. Conspiracy theories are particularly effective because they’re self-sealing—evidence against the conspiracy is interpreted as further proof of its existence and reach. They also create a sense of special knowledge among believers, reinforcing group identity and resistance to correction.

Provoking reactions deliberately disrupts constructive conversation. Some misinformation campaigns intentionally use inflammatory content to generate outrage, derail productive discussion, and increase social division. During elections, these operations often target sensitive issues specifically to heighten tensions and undermine meaningful democratic discourse.

Social Media Amplifies Our Susceptibility

While misinformation has existed throughout history, van der Linden argues that today’s digital environment makes false information uniquely dangerous. Unlike traditional media, with its editorial gatekeepers and fact-checking processes, social platforms allow anyone to instantly publish content to potentially massive audiences. This democratization of publishing brings many benefits, but it also eliminates quality controls that once filtered out the most egregious falsehoods. Van der Linden explains that when this democratic publishing ability is combined with algorithms that maximize engagement rather than accuracy, the result is an information ecosystem that rewards sensationalism over truth.

Our modern information environment interacts with our psychological vulnerabilities in three potentially treacherous ways. First, van der Linden contends that its constant stream of information overwhelms our ability to evaluate each claim, forcing us to rely more heavily on the cognitive shortcuts that already make us susceptible to misinformation. Second, social media platforms specifically amplify emotional content, which spreads faster than neutral information, thus exploiting our emotional vulnerabilities more effectively than traditional media. Third, the social validation that once helped us identify reliable information now works against us—when we see false information shared by multiple connections, our natural trust in our social networks leads us to accept misinformation more readily. 

Social media also creates what van der Linden calls echo chambers and filter bubbles—environments where we primarily encounter information that aligns with our existing beliefs. Within these isolated information spaces, misinformation that confirms group narratives spreads rapidly while corrections from outside sources are easily dismissed. As our information diets narrow, we become more confident in our beliefs while being exposed to fewer diverse viewpoints—a combination that makes us more susceptible to manipulation.

Traditional Approaches to Curtailing Misinformation Fail

Given these vulnerabilities, van der Linden explains that conventional responses to misinformation—primarily fact-checking and debunking—often fall short despite good intentions. The fundamental problem is timing. Misinformation can reach millions of people before fact-checkers even identify it. By the time corrections appear, the false information has already influenced decisions, shaped opinions, and become embedded in people’s thinking. Studies consistently show that corrections reach only a fraction of those who saw the original falsehood—one analysis found that the average fake news story on Facebook reached about 960,000 users, while the related fact-check reached fewer than 30,000.

Even when corrections do reach people, they face significant psychological barriers. Our brains don’t simply delete misinformation when presented with corrections. Instead, van der Linden reports, the false information continues to influence our thinking. When researchers debunked false claims about weapons of mass destruction in Iraq, they found that people continued to reference these debunked claims when reasoning about the war. This happens because misinformation becomes integrated into our mental models, making it difficult to fully extract even when we consciously recognize it as false.

Sometimes corrections can backfire. When fact-checkers repeat a myth in order to debunk it—for example, stating “there’s no evidence that vaccines cause autism”—they inadvertently increase familiarity with the false claim. This familiarity can later be mistaken for truth—a phenomenon known as the familiarity backfire effect. Even more challenging is the worldview backfire effect, where corrections that threaten deeply held beliefs or identities can actually strengthen attachment to the misinformation rather than correct it.

The challenges facing traditional fact-checking approaches create what van der Linden describes as a perfect storm: Misinformation exploits our psychological vulnerabilities to spread rapidly through social networks, while these same vulnerabilities make after-the-fact corrections ineffective. It’s like fighting a virus that has evolved both to spread efficiently and to resist available treatments. This reality points to the need for a fundamentally different approach—one that focuses on prevention rather than cure.

How Can We Build Immunity Against Misinformation?

If misinformation spreads like a virus and exploits our cognitive vulnerabilities, how can we protect ourselves and our communities? Van der Linden proposes a preventative approach called psychological inoculation—a method that builds mental resistance against misinformation before exposure rather than trying to correct false beliefs after they’ve taken root.

The Principle of Psychological Inoculation

The core insight of van der Linden’s approach comes from an analogy to medical immunology. Just as a biological vaccine exposes your body to a weakened form of a virus to trigger immunity, psychological inoculation exposes your mind to weakened forms of misinformation techniques to help you develop resistance. Van der Linden calls this approach “prebunking”—preparing people’s mental defenses before they encounter false information rather than debunking it afterward.

The psychological inoculation process involves three essential components: 

First, forewarning alerts people that they might be targeted with manipulation attempts. This activates their cognitive defenses and increases their vigilance, similar to how your immune system becomes alert when it detects a potential threat. 

Second, weakened exposure presents mild examples of misinformation techniques. These examples are strong enough to trigger recognition but not so powerful that they actually persuade. This is analogous to how vaccines contain weakened pathogens that can’t cause disease but still trigger an immune response.

Third, refutation explains why the technique is misleading, helping people generate their own counterarguments. This builds the mental equivalent of antibodies—specific defenses against future manipulation attempts.

The Two Types of Psychological Inoculation

Van der Linden explains that there are two ways to use psychological inoculation, depending on circumstances. The first is fact-based inoculation, which targets particular falsehoods: warning people about specific claims they’re likely to encounter, presenting weakened versions of those claims, and providing accurate information to counter them.

For example, before an election, authorities might warn voters: “You may see claims that mail-in ballots lead to widespread fraud. These claims often cite isolated incidents while ignoring the security measures that prevent systematic fraud.” This warning, coupled with information about ballot security, helps voters resist false claims. Fact-based inoculation works like a targeted vaccine against a specific strain of a virus. It’s precise and effective against anticipated misinformation, but it doesn’t necessarily protect against new false claims on other topics. 

Van der Linden explains that a second strategy is using technique-based inoculation to teach people to recognize and resist common manipulation tactics rather than focusing on particular false claims. For instance, instead of addressing specific falsehoods about climate change, technique-based inoculation might teach people to recognize when someone’s using fake experts, emotional manipulation, or false dichotomies. Technique-based inoculation works like a broad-spectrum vaccine, protecting against multiple “strains” of misinformation. By learning to identify manipulation techniques, people develop resistance even to misinformation they haven’t specifically been warned about.

How to Inoculate Yourself Against Misinformation

Van der Linden and his team at the Cambridge Social Decision-Making Lab have developed practical tools and strategies, based on his concept of psychological inoculation, that individuals can use to build their own immunity against misinformation. These include interactive games as well as educational resources.

Learn Through Interactive Games

The most effective way to build your resistance, according to van der Linden, is to generate your own mental defenses through hands-on experience. His team has created several interactive games that make this possible. For example, in “Bad News,” you take on the role of a fake news producer trying to build an audience. As you play, you learn to deploy various manipulation techniques to gain followers while maintaining credibility. This lets you experience misinformation creation from the manipulator’s perspective, helping you recognize these same techniques when you encounter them in real life.

Studies evaluating the game show that after just 15 minutes of gameplay, people improve their ability to spot manipulation techniques without becoming overly skeptical of legitimate news. The game has been translated into over 15 languages and played by millions of people. Van der Linden’s team has also developed games for other contexts: “Harmony Square” focuses on political manipulation tactics, and “Go Viral!” targets COVID-19 misinformation.

Seek Out Educational Resources

If gaming isn’t your preferred approach, van der Linden’s research has informed other resources that can help you build immunity. His team’s short educational videos follow a consistent structure that mimics the inoculation process: They warn about a specific manipulation technique, explain how it works, provide memorable examples, and teach you how to counter it. In one study, a video explaining false dichotomies—using the Star Wars clip where Anakin Skywalker says, “If you’re not with me, then you’re my enemy”—improved viewers’ ability to identify this manipulation technique in real-world examples.

Practice Everyday Inoculation Habits

You can also apply inoculation principles in your everyday information consumption. Van der Linden suggests that before consuming news about controversial topics, it can help to take a moment to consider what manipulation techniques might be present. Ask yourself: Is this story trying to trigger a strong emotional reaction? Is it creating an artificial us-versus-them division? Is it using apparent expertise to bolster weak claims? This self-forewarning can activate your mental defenses before exposure. Even simple fact-checking habits can function as a form of self-inoculation. By regularly verifying information from multiple sources before accepting or sharing it, you’re training your mind to be more resistant to misinformation over time.

How to Achieve Herd Immunity Against Misinformation

The power of psychological inoculation extends beyond individual protection. Van der Linden explains that when enough people are inoculated against misinformation, it creates what he calls psychological herd immunity—a community-level protection that reduces misinformation’s ability to spread through social networks. This community immunity works through three key mechanisms: breaking transmission chains of misinformation, creating protective social networks that share inoculation with others, and building systemic defenses through institutional approaches.

Break Transmission Chains

Just as vaccinated individuals help stop diseases from spreading, inoculated people help break misinformation transmission chains. Van der Linden’s research shows that people who’ve undergone psychological inoculation are significantly less likely to share misleading content. Computer simulations demonstrate that when a critical mass of people become resistant to misinformation, its spread through a community dramatically decreases. This effect is powerful because of how misinformation typically spreads—through social sharing rather than direct exposure to the original source. If key people in a social network are inoculated, they can effectively block misinformation from reaching larger portions of the community.

Create Protective Social Networks

One of the most powerful ways to achieve widespread immunity is to share what you’ve learned about misinformation. Van der Linden encourages people to discuss manipulation tactics they’ve identified with friends and family. When you notice misinformation being shared in your social circles, respectfully pointing it out helps others develop resistance too. Van der Linden’s research on what he calls “post-inoculation talk” shows that people who’ve learned to recognize manipulation techniques often warn others about them, effectively passing on their immunity. This social sharing amplifies the effect of initial inoculation efforts, potentially reaching people who wouldn’t seek out formal prebunking resources on their own.

Build Systemic Defenses

To achieve true herd immunity against misinformation, van der Linden argues that we need to systematically implement inoculation approaches. His research suggests that to be effective, inoculation efforts need to reach a critical threshold of the population—approximately 60%, according to computer simulations. Additionally, providing inoculation before misinformation begins to circulate is more effective than gradually implementing it after falsehoods have already spread.
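The threshold logic behind these simulations can be illustrated with a toy spreading model. This is a minimal sketch, not van der Linden’s actual model—the population size, contact count, and sharing probability below are all arbitrary assumptions chosen only to show why immunizing a critical fraction of a network chokes off spread:

```python
import random

def outbreak_size(n=2000, contacts=8, p_share=0.25, inoculated=0.0, seed=42):
    """Toy model of a false claim spreading through a population.

    Each person who adopts the claim exposes `contacts` random others,
    each of whom adopts it with probability `p_share`—unless they're
    inoculated, in which case they neither adopt nor pass it on.
    Returns the fraction of the population that ends up adopting it.
    """
    rng = random.Random(seed)
    immune = set(rng.sample(range(n), int(n * inoculated)))
    # Seed the outbreak with ten non-inoculated initial spreaders
    frontier = [i for i in range(n) if i not in immune][:10]
    seen = set(frontier)
    while frontier:
        nxt = []
        for _ in frontier:
            for _ in range(contacts):
                target = rng.randrange(n)
                # Inoculated (or already-reached) people block transmission
                if target in immune or target in seen:
                    continue
                if rng.random() < p_share:
                    seen.add(target)
                    nxt.append(target)
        frontier = nxt
    return len(seen) / n
```

With no inoculation, each believer converts about `contacts * p_share = 2` new people on average, so the claim sweeps most of the network. At 60% inoculation, the effective reproduction number falls to 0.8—below the critical value of 1—and the outbreak fizzles out, mirroring the threshold behavior the simulations describe.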

Van der Linden’s research has informed several large-scale applications of psychological inoculation. First, educational systems in countries like Finland and Estonia have incorporated inoculation principles into media literacy curricula, teaching students to recognize manipulation techniques from an early age. Technology companies including Google have also implemented prebunking videos on platforms like YouTube to help users identify manipulation techniques. Lastly, public health organizations used psychological inoculation during the COVID-19 pandemic to combat vaccine misinformation.


Elizabeth Whitworth

Elizabeth has a lifelong love of books. She devours nonfiction, especially in the areas of history, theology, and philosophy. A switch to audiobooks has kindled her enjoyment of well-narrated fiction, particularly Victorian and early 20th-century works. She appreciates idea-driven books—and a classic murder mystery now and then. Elizabeth has a Substack and is writing a book about what the Bible says about death and hell.
