Black Swan Fallacy: Why You See What You Want to See

This article is an excerpt from the Shortform summary of "The Black Swan" by Nassim Taleb. Shortform has the world's best summaries of books you should be reading.


What is the black swan fallacy? How is it most famous today?

The black swan fallacy is the tendency to ignore evidence that contradicts one's beliefs and assumptions. It can also refer to the tendency to believe that things one has never witnessed don't exist.

We’ll cover the roots of the black swan fallacy and look at some modern-day examples.

The Black Swan Fallacy

For millennia, it was universally accepted that all swans were white. In fact, this truth was so incontrovertible that logicians would often use it to illustrate the process of deductive reasoning. That classic deduction went like this:

  1. All swans are white
  2. The bird is a swan
  3. The bird is white

But in 1697, Willem de Vlamingh, a Dutch explorer, discovered black swans while on a rescue mission in Australia—and, in an instant, a universal, incontrovertible truth was shown to be anything but.

After de Vlamingh’s discovery, philosophers used the term “black swan” to describe a seeming logical impossibility that could very well turn out to be possible. This is the root of the black swan fallacy.

Nassim Nicholas Taleb, however, offers a new spin on the term “black swan.” He uses it to describe specific historical events with specific impacts. These events have three salient features:

  • They are “outliers” (that is, they lie outside the realm of regular expectations);
  • They have profound real-world impacts; and
  • Despite (or perhaps because of) their extreme unpredictability, they compel human beings to account for them, to explain after the fact that they were predictable all along.

Some examples of Black Swan events include World Wars I and II, the fall of the Berlin Wall, 9/11, the rise of the Internet, the stock-market crash of 1987, and the 2008 financial crisis.

Taleb’s thesis is that Black Swans, far from being insignificant or unworthy of systematic study, comprise the most significant phenomena in human history. We should study them, even if we can’t predict them. Thus, counter-intuitively, we would be better served by concentrating our intellectual energies on what we don’t—nay, can’t—know, rather than on what we do and can know.

Taleb also claims, also counter-intuitively, that the more our knowledge advances, the more likely we are to be blindsided by a Black Swan. This is because our knowledge is forever becoming more precise and specific and less capable of recognizing generality—for example, the general tendency for earth-shattering events to be completely unforeseen (which, of course, is why they’re earth-shattering).

The Black Swan Fallacy, Turkeys, and Other Problems with Induction

The black swan fallacy is often considered a problem with inductive reasoning. Let’s look at an example of inductive reasoning that illustrates the black swan fallacy.

Picture a turkey cared for by humans. It has been fed every day for its entire life by the same humans, and so it has come to believe the world works in a certain, predictable, and advantageous way. And it does…until the day before Thanksgiving.

Made famous by British philosopher Bertrand Russell (though, in his telling, the unlucky bird was a chicken), this story illustrates the problem with inductive reasoning (the derivation of general rules from specific instances). With certain phenomena—marketing strategy, stock prices, record sales—a pattern in the past is no guarantee of a pattern in the future.

In Taleb’s words, the turkey was a sucker—it had full faith that the events of the past accurately indicated the future. Instead, it was hit with a Black Swan, an event that completely upends the pattern of the past. (It’s worth noting that the problem of inductive reasoning is the problem of Black Swans: Black Swans are possible because we lend too much weight to past experience.) This is a version of the black swan fallacy.
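The turkey's reasoning can be sketched in a few lines of code. This is an illustrative toy, not anything from Taleb's book: the naive inductive model simply projects the observed frequency forward, so 1,000 identical days produce total (and totally misplaced) confidence.

```python
# Sketch of the turkey problem: a naive inductive model trained on
# 1,000 days of feeding predicts day 1,001 with full confidence.
history = [1] * 1000  # 1 = fed; every observed day looks the same

# Naive induction: estimate tomorrow's outcome from past frequency.
p_fed_tomorrow = sum(history) / len(history)
print(p_fed_tomorrow)  # 1.0 -- the model sees no possibility of slaughter

# Day 1,001 (the day before Thanksgiving): the pattern breaks.
day_1001 = 0  # 0 = slaughtered; an outcome absent from all past data
print(abs(p_fed_tomorrow - day_1001))  # 1.0 -- maximal surprise
```

The point is that no amount of consistent past data can rule out an outcome that has simply never appeared in the sample.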

Another example of faulty inductive reasoning, this time from the world of finance, concerns the hedge fund Amaranth (ironically named after a flower that’s “immortal”), which incurred one of the steepest losses in trading history: $7 billion in less than a week. Just days before the company went into a tailspin, Amaranth had reminded its investors that the firm employed twelve risk managers to keep losses to a minimum. The problem was that these risk managers—or suckers—based their models on the market’s past performance. Again, this is the black swan fallacy.

In order not to be suckers, we must (1) cultivate an “empirical skepticism”—that is, a skepticism steeped in fact and observation—and (2) remain vigilant against the innately human tendencies that leave us vulnerable to Black Swans.

The Black Swan Fallacy and Confirmation Bias

The black swan fallacy can also refer to a type of confirmation bias. All too often we draw universal conclusions from a particular set of facts. For example, if we were presented with evidence that showed a turkey had been fed and housed for 1,000 straight days, we would likely predict the same for day 1,001 and for day 1,100.

Taleb calls this prediction the “round-trip fallacy.” When we commit the round-trip fallacy, we assume that “no evidence of x”—where x is any event or phenomenon—is the same as “evidence of no x.”

For example, in the turkey illustration, we might assume that “no evidence of the possibility of slaughter” equals “evidence of the impossibility of slaughter.” To take a medical example, if a cancer screening comes back negative, there is “no evidence of cancer,” not “evidence of no cancer” (because the scan isn’t perfect and could have missed something).
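The screening example can be made concrete with Bayes' rule. The prevalence, sensitivity, and specificity below are made-up numbers chosen purely for illustration; the point is that a negative result shrinks the probability of cancer without driving it to zero.

```python
# "No evidence of x" is not "evidence of no x": a negative screening
# result lowers, but does not eliminate, the probability of cancer.
# All rates below are assumed for illustration, not real test data.
p_cancer = 0.01      # prior prevalence (assumed)
sensitivity = 0.90   # P(positive | cancer): the test misses 10% of cases
specificity = 0.95   # P(negative | no cancer)

# P(negative) by the law of total probability
p_neg = (1 - sensitivity) * p_cancer + specificity * (1 - p_cancer)

# Bayes' rule: P(cancer | negative result)
p_cancer_given_neg = (1 - sensitivity) * p_cancer / p_neg
print(round(p_cancer_given_neg, 4))  # 0.0011 -- small, but not zero
```

Because the test misses some true cases, a clean scan is an absence of evidence, not proof of absence.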

In addition to drawing broad conclusions from narrow observations, we also have a tendency to select evidence on the basis of preconceived frameworks, biases, or hypotheses. For example, a scientist conducting an experiment may, often unconsciously, discount evidence that disconfirms her hypothesis in favor of the evidence that confirms it. Taleb calls this habit “naive empiricism,” but it’s more commonly known as “confirmation bias.” Some people call it the black swan fallacy.

Taleb’s solution to naive empiricism/confirmation bias is negative empiricism—the rigorous search for disconfirming, rather than corroborating, evidence. This technique was pioneered by a philosopher of science named Karl Popper, who called it “falsification.” The reason negative empiricism/falsification is so effective is that we can be far more sure of wrong answers than right ones.
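Falsification can be sketched as a search procedure: instead of tallying confirmations, scan the observations for a single counterexample. The function below is a hypothetical illustration, not a formal rendering of Popper.

```python
# Negative empiricism in miniature: rather than counting white swans
# as confirmation, look for a single observation that refutes the rule.
observations = ["white"] * 999 + ["black"]  # one black swan among many

def falsified(hypothesis_color, sightings):
    """Return True as soon as any sighting contradicts the hypothesis."""
    return any(color != hypothesis_color for color in sightings)

print(falsified("white", observations))     # True: one counterexample suffices
print(falsified("white", ["white"] * 999))  # False: unfalsified, but not proven
```

Note the asymmetry: a single black swan settles the question, while 999 white swans settle nothing. That is why we can be far more sure of wrong answers than right ones.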

Black Swan Fallacy and the Ludic Fallacy

“Tunneling,” a repercussion of the distortion of silent evidence (an element of the black swan fallacy), describes the natural human tendency to favor knowns and known unknowns over unknown unknowns. In other words, our understanding of uncertainty is based almost exclusively on what has happened in the past rather than what could have happened.

The primary practitioners of tunneling are those Taleb calls “nerds”—academics, mathematicians, engineers, statisticians, and the like. Nerds are those who think entirely “inside the box”; they Platonify the world and can’t perceive possibilities that lie outside their scientific models and academic training.

Nerds suffer from the “ludic fallacy.” (“Ludic” comes from the Latin word ludus, which means “game.”) That is, they treat uncertainty in real life like uncertainty in games of chance, such as roulette or blackjack. The problem with this approach is that, unlike games of chance, real life has no rules.


Nerds aren’t the only ones guilty of the ludic fallacy, however; average people indulge it as well. For example, most people think casino games represent the height of risk and uncertainty. In truth, casino games hail from Mediocristan—there are clear and definite rules that govern play, and the odds of winning or losing are calculable. Unlike real life, the amount of uncertainty in a casino game is highly constrained. This is another version of the black swan fallacy.
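"Calculable" is meant literally here: because the rules of a casino game fix the probabilities in advance, the expected loss of any bet can be computed exactly. As a sketch (using a standard single-number bet on European roulette, not an example from the book):

```python
# Why casino games live in "Mediocristan": the rules fix the odds, so
# risk is fully calculable in advance. Example: a single-number bet on
# European roulette (37 pockets, 35-to-1 payout).
from fractions import Fraction

p_win = Fraction(1, 37)
payout = 35  # units won per unit staked on a win

# Expected value per unit bet: win payout with p_win, lose the stake otherwise.
expected_value = p_win * payout - (1 - p_win) * 1
print(expected_value)         # -1/37 per unit bet
print(float(expected_value))  # about -0.027: a known, bounded house edge
```

No such closed-form number exists for real-world uncertainty, which is exactly what the ludic fallacy forgets.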


———End of Preview———

Like what you just read? Read the rest of the world's best summary of "Black Swan" at Shortform. Learn the book's critical concepts in 20 minutes or less.

Here's what you'll find in our full Black Swan summary:

  • Why world-changing events are unpredictable, and how to deal with them
  • Why you can't trust experts, especially the confident ones
  • The best investment strategy to take advantage of Black Swans

Amanda Penn

Amanda Penn is a writer and reading specialist. She’s published dozens of articles and book reviews spanning a wide range of topics, including health, relationships, psychology, science, and much more. Amanda was a Fulbright Scholar and has taught in schools in the US and South Africa. Amanda received her Master's Degree in Education from the University of Pennsylvania.
