What is silent evidence? How does ignoring it give us an inaccurate picture of the world?
Silent evidence is the “flipside” of any story we’re told: the details of a situation that we don’t have access to and therefore tend to forget exist.
We’ll cover how the distortion of silent evidence leads to cognitive biases and why these distortions leave us surprised by and unprepared for the world’s randomness.
The Distortion of Silent Evidence
History—which Nassim Nicholas Taleb defines as “any succession of events seen with the effect of posterity”—is inevitably, necessarily distorted. That is, no matter how “factual” or “objective,” the historical record is always a product of our tendency to narrate and thus always biased.
What the narratives are biased against is randomness—the inexplicability and importance of Black Swans.
Silent Evidence Example #1
Take most CEOs’ and entrepreneurs’ (auto)biographies. These books attempt to draw a causal link between the CEO/entrepreneur’s (a) character traits, education, and business acumen and (b) later success. The “silent evidence” (which Taleb also calls “the cemetery”) in these narratives is that there are many more people with the same attributes as the triumphant CEOs/entrepreneurs who failed. The fact is, in business, like in so many other fields, the deciding factor is nothing other than luck (i.e., randomness).
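The survivorship logic here can be sketched with a toy simulation (the numbers are illustrative, not from Taleb): if enough founders play a game of pure chance, a handful will rack up long winning streaks with no skill at all, and only their stories get told.

```python
import random

random.seed(42)  # fixed seed so the illustrative run is repeatable

def lucky_survivors(n_founders=10_000, n_years=10, p_success=0.5):
    """Simulate founders whose survival each year is a pure coin flip.

    Returns how many survive all n_years by luck alone."""
    survivors = 0
    for _ in range(n_founders):
        if all(random.random() < p_success for _ in range(n_years)):
            survivors += 1
    return survivors

# With 10,000 founders and a 50% yearly survival rate, we expect
# roughly 10,000 * 0.5**10, i.e. about 10 "visionaries" after a
# decade. Their biographies get written; the other ~9,990 are the
# cemetery of silent evidence.
print(lucky_survivors())
```

Even though every founder in this sketch is identical, a biographer looking only at the survivors would find plenty of "traits" to credit for their success.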
Once we become attuned to the existence of “silent evidence”—which we can think of as the “flipside” or contrary to any story we’re told—we can see it everywhere.
Silent Evidence Example #2
For example, we base our knowledge of criminals and criminality solely on those who get caught. In other words, we have no real sense of how easy or hard it is to get away with a crime, because our notions about crime are formed entirely by reports of failed criminals. The silent evidence comes from the criminals who don’t get caught.
In the case of criminality, our ignorance of silent evidence actually serves a socially positive function because it makes getting away with a crime look hard and thus dissuades would-be criminals. But silent evidence can also encourage socially detrimental behaviors, such as risk-taking. This is because the successful risk-takers are the ones whose story is told; the failed risk-takers end up in the cemetery (in Taleb’s sense, but often literally as well).
Silent Evidence Example #3
Take, for example, Giacomo Casanova, the legendary ladies’ man and adventurer. Casanova was blessed with seemingly infallible luck; somehow he always managed to escape whatever predicament he found himself in. He believed he had a “lucky star” watching out for him.
But for every Casanova, there are dozens of wannabe adventurers and seducers whose risks don’t pay off. This is the silent evidence. We get a skewed sense of the benefits of risk-taking because the successful risk-takers are the ones with the means to tell their tales.
(Shortform note: Taleb doesn’t acknowledge that stories of risks that fail spectacularly—from Bernie Madoff to Theranos to Lehman Brothers—do get told quite often.)
Silent Evidence Example #4
A more far-reaching variation on the concept of silent evidence is the “anthropic cosmological argument.” This argument, touted by physicists and philosophers alike, states that human existence cannot be a random occurrence, given the number and specificity of the factors that provide for it. In other words, the odds are so stacked against human existence that the only possibility is that the world was created precisely to allow for it.
Again, the anthropic cosmological argument suppresses the silent evidence of all the other species that didn’t thrive. We assume our beating the odds is the result of destiny, but it’s really a matter of numbers: Given the incredible number of species competing for survival, it stands to reason that one of them (humans, in this case) would win the jackpot.
Our Tendency to “Tunnel”
A repercussion of the Distortion of Silent Evidence, “tunneling” describes the natural human tendency to focus on knowns and known unknowns rather than unknown unknowns. In other words, we base our understanding of uncertainty almost exclusively on what has happened in the past rather than on what could have happened.
The primary practitioners of tunneling are those Taleb calls “nerds”—academics, mathematicians, engineers, statisticians, and the like. Nerds are those who think entirely “inside the box”; they Platonify the world and can’t perceive possibilities that lie outside their scientific models and academic training.
———End of Preview———
Like what you just read? Read the rest of the world's best summary of "Black Swan" at Shortform. Learn the book's critical concepts in 20 minutes or less.
Here's what you'll find in our full Black Swan summary:
- Why world-changing events are unpredictable, and how to deal with them
- Why you can't trust experts, especially the confident ones
- The best investment strategy to take advantage of Black Swans