What is the narrative fallacy? How does our natural tendency to tell stories get in our way?
The narrative fallacy is the cognitive bias that puts us in danger of ascribing meaning or cause to random events. To cope with the world's complexity, our brains compulsively simplify the information they receive, and the most efficient way of simplifying that complexity is through storytelling.
We’ll cover the narrative fallacy, look at narrative fallacy examples, and suggest two ways to counter it.
The Narrative Fallacy
Because humans are naturally inclined toward stories with distinct causes and effects, we are perennially in danger of committing the “narrative fallacy”—the ascription of meaning or cause to random events.
Our tendency to narrativize is part and parcel of our compulsion to interpret. Humans are evolutionarily conditioned—by the development of the left hemisphere of our brains—to reduce the complexity of the world’s information (we’ll discuss why in a moment); and the most efficient way of simplifying that complexity is through interpretation. But this can lead to cognitive bias and error, in this case the narrative fallacy.
For example, read the following poem:
All work and no
no play makes Jack
a dull boy
Notice anything strange? (There’s an extra “no” in the second line.) When experimental subjects reading a similar poem had the left hemispheres of their brains impaired, they were better able to notice the extra “no.” This suggests the left hemisphere drives our urge to impose patterns, and thus the narrative fallacy.
Neurotransmitters in the brain, too, encourage interpretation. When patients are administered dopamine supplements, they become more likely to see patterns where there are none.
Why are humans predisposed to interpretation? For a very practical reason: It makes information easier for our brains to store. Whereas retaining 100 randomly ordered numbers would be nearly impossible, retaining 100 numbers ordered according to a specific rule is much easier—we only have to remember the rule. When we interpret—or narrativize—we’re attempting to impose our own organizing rule on the random facts of the world.
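The storage point above can be made concrete. This is a minimal sketch (not from the source, and the specific rule is my own illustration): a rule-governed list of 100 numbers can be stored as a short rule, while a random list's shortest description is essentially the list itself.

```python
import random

# Rule-governed data: reproducible from a tiny description.
rule_description = "2*i for i in 1..100"
rule_based = [2 * i for i in range(1, 101)]  # the whole list follows from the rule

# Random data: no rule regenerates it, so you must store every value.
random.seed(0)  # seeded only so the example is repeatable
random_list = [random.randint(1, 1000) for _ in range(100)]

# The rule is a couple of dozen characters; writing out the random
# list takes hundreds. Memory favors the rule.
print(len(rule_description) < len(str(random_list)))
```

The analogy to narrative is loose but suggestive: a story is a compression rule imposed on events, which is exactly why it's so tempting to find one even where the events are random.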
How to Counter the Narrative Fallacy
So if we’re biologically and psychologically conditioned to narrativize—and, thus, to remain blind to the totally random (i.e., a Black Swan)—what choice do we have? How can we avoid the narrative fallacy? Taleb offers two suggestions:
Solution #1: Favor System 2 over System 1
Empirical psychologists like Nobel Laureate Daniel Kahneman and Amos Tversky have distinguished two types of thinking in human beings: “System 1” thinking, which is experiential, kneejerk, intuitive, and effortless; and “System 2” thinking, which is cogitative, slow, and effortful.
System 1 thinking is what kicks in when we immediately situate an event in a particular framework or pattern. For example, when we hear “no evidence of cancer” and think “evidence of no cancer,” we’re receiving the information using our System 1 resources. This may lead to the narrative fallacy.
When faced with new information, we have to make a concerted effort to engage System 2 thinking—to pause and reason about the real significance of the information.
Solution #2: Favor Experiments Over Stories
Rather than trust in “expert” analyses or attempt to extrapolate cause-and-effect narratives from newspapers—materials that are typically based on inductive reasoning—we should seek out empirical experimental evidence (for example, studies conducted by empirical psychologists like Kahneman and Tversky).
Predicting the Past
Through the limitations of inductive reasoning as illustrated by the turkey anecdote, as well as the distortions of the narrative fallacy and silent evidence, we’ve seen how problematic the past is vis-à-vis prediction. Because of these phenomena and others, the past itself is as unknowable as the future.
One of the major obstacles that prevents us from knowing the past with certainty is the impossibility of reverse engineering causes for events. That is, there’s no way to determine the precise cause of an event by working backward in time from the event itself. Yet we still commit the narrative fallacy when we look to the past.
An example should help illustrate.
Think of an ice cube sitting on a table. Imagine the shape of the puddle that ice cube will make as it melts.
Now think of a puddle on the table and try to imagine how that puddle got there.
The second thought experiment is much harder than the first. With the right physics know-how and ample time, one could model exactly what kind of puddle will result from the melting ice cube (based on the cube’s shape, the environmental conditions, etc.). In contrast, it’s nearly impossible to reverse engineer a cause from a random puddle (because the puddle could have been caused by any number of things).
When historians propose causes for certain historical events, they’re looking at puddles and imagining ice cubes (or a spilled glass of water, or some other cause). The problem is that the sheer number of possible causes for a puddle—or a historical event—renders any ascription of cause suspect. They’re committing the narrative fallacy.
Poincaré’s nonlinearities also illustrate this problem. Again, with the right tools and time, one might be able to observe how the flutter of a butterfly’s wings in India causes a hurricane in Florida, but it would be impossible to work backward from the hurricane to that cause—there are just too many other tiny events that may have played a part.
———End of Preview———
Like what you just read? Read the rest of the world's best summary of "Black Swan" at Shortform. Learn the book's critical concepts in 20 minutes or less.
Here's what you'll find in our full Black Swan summary:
- Why world-changing events are unpredictable, and how to deal with them
- Why you can't trust experts, especially the confident ones
- The best investment strategy to take advantage of black swans