What is the ludic fallacy? How does the tendency to “tunnel” into what we know lead us into making cognitive errors?
The ludic fallacy is the tendency to treat uncertainty in real life like uncertainty in games of chance. The problem with this approach is that, unlike games of chance, real life has no rules.
We’ll cover how the ludic fallacy shapes our predictions and why thinking of uncertainty like a game of blackjack is unproductive.
The Ludic Fallacy: Our Tendency to “Tunnel”
“Tunneling” describes the natural human tendency to favor knowns and known unknowns over unknown unknowns. In other words, our understanding of uncertainty is based almost exclusively on what has happened in the past rather than on what could have happened.
The primary practitioners of tunneling are those Taleb calls “nerds”—academics, mathematicians, engineers, statisticians, and the like. Nerds are those who think entirely “inside the box”; they Platonify the world and can’t perceive possibilities that lie outside their scientific models and academic training.
Nerds suffer from the “ludic fallacy.” (“Ludic” comes from the Latin word ludus, which means “game.”) That is, they treat uncertainty in real life like uncertainty in games of chance, for example roulette or blackjack. The problem with this approach is that, unlike games of chance, real life has no rules.
Nerds aren’t the only ones guilty of the ludic fallacy, however; average people indulge in it as well. For example, most people think casino games represent the height of risk and uncertainty. In truth, casino games hail from Mediocristan—clear and definite rules govern play, and the odds of winning or losing are calculable. Unlike in real life, the amount of uncertainty in a casino game is highly constrained. This is the rub of the ludic fallacy.
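To see just how tame Mediocristan is, the claim that casino odds are "calculable" can be made concrete with a few lines of arithmetic. The sketch below (not from Taleb; the function name and setup are illustrative) computes the exact expected profit of a single-number bet on a European roulette wheel:

```python
# Illustrative sketch: European roulette as a textbook Mediocristan game.
# 37 pockets (0-36), a fixed 35-to-1 payout on a single-number bet --
# every probability and payoff is known in advance.
from fractions import Fraction

POCKETS = 37             # numbers 0 through 36 on a European wheel
STRAIGHT_UP_PAYOUT = 35  # a winning single-number bet pays 35 to 1

def expected_value(bet: Fraction = Fraction(1)) -> Fraction:
    """Exact expected profit of a straight-up bet of the given size."""
    p_win = Fraction(1, POCKETS)
    p_lose = 1 - p_win
    return p_win * (bet * STRAIGHT_UP_PAYOUT) - p_lose * bet

print(expected_value())  # -1/37: a known house edge of about 2.7%
```

That a bettor's long-run loss rate can be pinned down to an exact fraction is precisely what makes games of chance unlike real life, where no such closed-form answer exists.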
The incredible unpredictability of real life can be illustrated by a look at casinos themselves, rather than their games. Casinos spend huge sums of money on sophisticated security systems—cameras, dealer training, alarms—in an effort to prevent cheating. These systems are designed to foil cheats who use methods that are either known or expected. But the casino Taleb studied suffered its worst losses not from cheating but from completely unexpected real-life events (ironically, it suffered from the ludic fallacy even though it was dealing with actual games):
- an entertainer was attacked by the tiger with which he performed, resulting in a $100 million loss due to cancelled shows;
- an employee, for years and for no good reason, neglected to file an essential tax form with the IRS, resulting in a huge fine; and
- the casino owner’s daughter was kidnapped, and to pay the ransom, the owner withdrew cash from the casino’s fund.
Taleb’s study of this particular casino yielded the insight that unexpected, real-life events caused greater losses to the casino than cheating by a factor of almost 1,000 to 1. Suffice it to say, to prepare ourselves for Black Swans, we must resist the ludic fallacy and think outside the rules of the games to which we’re accustomed.
The Distortion of Silent Evidence
Tunneling and the ludic fallacy are repercussions of the Distortion of Silent Evidence.
History—which Taleb defines as “any succession of events seen with the effect of posterity”—is inevitably, necessarily distorted. That is, no matter how “factual” or “objective,” the historical record is always a product of our tendency to narrate and thus always biased.
What the narratives are biased against is randomness—the inexplicability and importance of Black Swans.
Take most CEOs’ and entrepreneurs’ (auto)biographies. These books attempt to draw a causal link between the CEO/entrepreneur’s (a) character traits, education, and business acumen and (b) later success. The “silent evidence” (which Taleb also calls “the cemetery”) in these narratives is that many more people with the same attributes as the triumphant CEOs/entrepreneurs failed. The fact is, in business, as in so many other fields, the deciding factor is nothing other than luck (i.e., randomness). When we treat life as a game with rules rather than a world governed by luck, we’re vulnerable to the ludic fallacy.
Once we become attuned to the existence of “silent evidence”—which we can think of as the “flipside” of, or counterpoint to, any story we’re told—we can see it everywhere.
For example, we base our knowledge of criminals and criminality solely on those who get caught. In other words, we have no real sense of how easy or hard it is to get away with a crime, because our notions about crime are formed entirely by reports of failed criminals.
In the case of criminality, our ignorance of silent evidence actually serves a socially positive function because it makes getting away with a crime look hard and thus dissuades would-be criminals. But silent evidence can also encourage socially detrimental behaviors, such as risk-taking. This is because the successful risk-takers are the ones whose story is told; the failed risk-takers end up in the cemetery (in Taleb’s sense, but often literally as well). The ludic fallacy leads us to think there’s logic in the way the world works, but often, there’s not.