Are you superstitious at all? Can you tell when data is meaningful and when it’s just random?
According to Steven Pinker, an important aspect of rationality is getting a good handle on probability and randomness in order to make more accurate assessments and, in turn, better decisions. He looks at two ways that people misunderstand probability and shares a handy tool to counter these flaws in thinking.
Keep reading to get a better grasp on how probability works and how randomness comes into play.
Probability and Randomness
Probability can be best understood as the chance that an event will happen given the opportunity. When someone says there is a 50% chance of an event occurring, this means it will happen 50 out of 100 times, on average. If you flip a coin 100 times, the head-to-tail ratio may not be exactly 50-50, but there’s still a 50% chance of either heads or tails with every flip.
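As a quick illustration (not from the book; the seed and flip count are arbitrary), a simulation of 100 fair flips shows how the short-run split can drift from exactly 50-50 even though every individual flip stays at 50%:

```python
import random

random.seed(7)  # arbitrary seed so the run is repeatable

# Flip a fair coin 100 times: each flip has a 50% chance of heads.
heads = sum(1 for _ in range(100) if random.random() < 0.5)

print(f"{heads} heads, {100 - heads} tails")
```

Run it a few times with different seeds: the split wanders around 50-50, but no run changes the 50% chance of the next flip.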
(Shortform note: The authors of Superforecasting note that probabilities are only estimates, and predicting the occurrence of a real-world event is nearly impossible since it would require rewinding history and replaying the event multiple times to see all the different possible outcomes. Thus it’s important to keep in mind that the probabilities that most forecasters are concerned with—for example, election outcomes—are essentially their best guesses, not objective facts.)
People often miscalculate probability and overlook randomness, and these errors frequently lead to poor decisions. We’ll cover a few key reasons why this happens—how certain cognitive biases (or heuristics) misdirect us, and what we can do to prevent that.
The Availability Heuristic
According to the availability heuristic, people judge the likelihood of an event based on how readily they remember it happening before, rather than by rationally calculating the probability based on how often the event actually happens. That is, they rely on what information is most available, rather than what is most representative of the truth.
A commonly cited example is when people are more afraid of flying on an airplane than driving in a car. The probability of getting injured in a car is higher than in a plane, but people remember plane crash headlines better. They therefore falsely believe plane crashes are more probable. This false belief can be harmful: Pinker notes that by irrationally preferring to drive, many have likely driven to their deaths rather than fly on a safer aircraft.
Availability Is Caused by More Than Headlines
The availability heuristic not only favors memorable ideas, such as plane crashes, but also ideas that are easily visualized: vivid images capture our attention more readily, and so the events they depict seem more probable. In one study demonstrating this, some participants were asked to guess the chances of a massive flood happening anywhere in North America, while others were asked to guess the chances of a massive flood happening in California due to an earthquake.
By definition, an earthquake-caused flood in California can be no more likely than a flood anywhere in North America, since the former is just one specific way the latter could happen. Yet participants rated the California flood as far more likely. Researchers posited that they did so because they could readily picture an earthquake-caused flood in California, being familiar with that type of event, whereas the vague notion of a flood “anywhere” in North America left no concrete mental image, so they underestimated its likelihood.
The Post Hoc Probability Fallacy
Another common probability blunder Pinker discusses is the post hoc probability fallacy. This is when, after something statistically unlikely occurs, people conclude that because it happened, it must have been likely to happen. They fail to account for the number of times the event could have occurred but didn’t, and for the fact that, given an enormous number of opportunities, some coincidences are bound to happen.
Post hoc probability fallacies are driven by the human tendency to seek patterns and ascribe meaning to otherwise meaningless or random events. This tendency leads to superstitions like astrology, belief in psychic powers, and other irrational beliefs about the world. It’s why, for example, if a tragedy occurs on a Friday the 13th, some will believe it’s because that day is cursed, ignoring the many Friday-the-13th dates that have passed with no tragedy, or the many tragedies that have occurred on dates other than Friday the 13th.
(Shortform note: In The Demon-Haunted World, Carl Sagan argues that supernatural and superstitious beliefs—often brought about by post hoc probability fallacies—can cause considerable societal harm because a society that holds minor irrational beliefs is more likely to hold major irrational beliefs. So, even seemingly innocuous beliefs like astrology or believing in guardian angels are potentially harmful because they lead to a more credulous and less critical society.)
Using Bayesian Reasoning to Counter Probability Fallacies
Pinker says that to avoid falling for fallacies like these, we can use Bayesian reasoning, which is based on a mathematical theorem named after the eighteenth-century thinker Thomas Bayes. The theorem describes how to base our judgments of probability on evidence (information showing how often an event actually occurs). We won’t detail the full equations here, but essentially, Bayes’ theorem helps you weigh all the relevant probabilities associated with a possible outcome to determine its true likelihood, which can differ greatly from the likelihood that intuitively seems most accurate.
One common use of this theorem is in judging the probability that a medical diagnosis is correct, which Pinker says is an archetypal example of Bayesian reasoning aiding accurate assessments of probability. Let’s say you test positive for cancer. Most people (including many medical professionals) might believe that because the test came back positive, there’s an 80-90% chance you have the disease.
However, once the other relevant probabilities are taken into account, the true risk turns out to be much lower. If the cancer in question occurs in just 1% of the population (this is the base rate—the evidence of how often the event actually occurs), and the test has a 9% false positive rate, then the true likelihood that you have cancer after a positive test is only about 9%.
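The arithmetic behind that 9% figure can be checked directly with Bayes’ theorem. The summary doesn’t state the test’s sensitivity, so the 90% figure below is an assumption:

```python
# Bayes' theorem applied to the cancer-test example. The prevalence and
# false positive rate come from the text; the 90% sensitivity is an
# assumption the summary doesn't state.
prevalence = 0.01   # 1% of the population has the cancer (base rate)
sensitivity = 0.90  # assumed: P(positive | cancer)
false_pos = 0.09    # P(positive | no cancer)

# Overall chance of a positive test across the whole population.
p_positive = sensitivity * prevalence + false_pos * (1 - prevalence)

# P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
posterior = sensitivity * prevalence / p_positive

print(f"P(cancer | positive test) = {posterior:.1%}")  # about 9%
```

The intuition: false positives among the healthy 99% of the population vastly outnumber true positives among the sick 1%, so most positive tests are false alarms.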
In everyday life, we can use Bayesian reasoning without resorting to plugging in numbers by following three general rules:
- Give more credence to things that are more likely to be true. If a child has blue lips on a summer day just after eating a blue popsicle, it’s more likely that the popsicle stained her lips than that she has a rare disease causing the discoloration.
- Give more credence to things if the evidence is rare and associated closely with a particular event. If that child with blue lips has not been eating popsicles, but does present with a rash and fever, the probability that the discoloration is caused by a disease increases.
- Give less credence to things when evidence is common and not closely associated with a particular event. If a child does not have discolored lips or any other symptoms of illness, there’s no rational reason to think she has the rare disease, even if some people with that disease have no symptoms.
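These three rules can be loosely sketched with Bayes’ theorem in odds form, where a likelihood ratio measures how strongly a piece of evidence favors a hypothesis. The prevalence and ratios below are invented purely for illustration:

```python
def posterior(prior, likelihood_ratio):
    """Bayes in odds form: posterior odds = prior odds * likelihood ratio."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

prior = 1e-5  # hypothetical prevalence of the rare disease

# Common, weakly associated evidence (blue lips right after a blue
# popsicle) barely moves the needle off the tiny prior.
weak = posterior(prior, likelihood_ratio=2)

# Rare, disease-specific evidence (blue lips plus rash and fever)
# raises the probability by orders of magnitude—yet it still isn't
# a certainty, because the prior was so low.
strong = posterior(prior, likelihood_ratio=5_000)

print(f"weak evidence:   {weak:.5%}")
print(f"strong evidence: {strong:.2%}")
```

The design mirrors the rules: the prior encodes “more likely to be true,” and the likelihood ratio encodes how rare and how closely associated the evidence is.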
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Steven Pinker's "Rationality" at Shortform.
Here's what you'll find in our full Rationality summary:
- Why rationality and reason are essential for improving our world and society
- How you can be more rational and make better decisions
- How to avoid the logical fallacies people often fall victim to