What is loss aversion? Do you think it’s rational to be loss-averse?
In psychology, loss aversion is a cognitive bias whereby individuals prefer avoiding losses to acquiring equivalent gains. Most psychologists describe loss aversion as irrational, and they justify this with math: studies show that people are willing to overpay to insure against financial loss. According to Nassim Taleb, however, loss aversion is an adaptive phenomenon because it helps us survive.
In this article, we’ll explore Taleb’s take on loss aversion.
Loss Aversion Helps Us Avoid Ruin
In psychology, loss aversion is considered a cognitive bias—that is, an error in thinking. According to Nassim Taleb, however, loss aversion is not necessarily a bad thing. He argues that this mental bias helps us survive. He takes into account one factor that, he claims, invalidates the conclusions of most social scientists who attempt to deal with probability: the effect of “ruin.”
“Ruin” is a state of loss that you can’t come back from. If a business suffers enough losses that it’s forced to shut down, it’s ruined. Even if its profits would skyrocket in the next quarter, it doesn’t matter. The business has suffered a permanent loss.
Opportunities for ruin are all around us, yet they’re largely ignored when academics try to analyze risk. It’s impossible to mathematically calculate risk versus reward when ruin is a potential outcome because no benefit could outweigh the finality of ruin.
Taleb uses the example of Russian roulette. Imagine someone offers you one million dollars to load one bullet into a six-chamber revolver, spin the cylinder, and fire at your own head. Traditional cost-benefit analysis would conclude that, on average, you can expect to make about $833,333 (a five-in-six chance of winning $1,000,000). That doesn't sound too bad! In reality, however, almost no one would take this deal. Cost-benefit analysis leads to invalid conclusions if you're risking a permanent loss.
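The arithmetic behind that "not too bad" figure can be sketched in a few lines. This is a minimal illustration; the $1,000,000 payoff and six-chamber odds come from the example, and treating death as a mere $0 outcome is exactly the simplification Taleb criticizes:

```python
# Naive expected-value analysis of the Russian roulette offer.
# Treating ruin (death) as just another $0 outcome is the flaw
# Taleb points out: no payoff offsets an unrecoverable loss.
payoff = 1_000_000      # reward for surviving the trigger pull
p_survive = 5 / 6       # five empty chambers out of six
p_ruin = 1 / 6          # one loaded chamber

naive_ev = p_survive * payoff + p_ruin * 0
print(f"Naive expected value: ${naive_ev:,.0f}")  # ≈ $833,333
```

The calculation is "correct" on its own terms; the error is in the terms, since one of the outcomes ends the game permanently.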
With this in mind, the human instinct to overcompensate and avoid risks appears more rational. Even if there's only a tiny chance of total ruin, it's worth taking precautions against it. For example, only about 5% of home insurance policyholders ever successfully file a claim, yet around 85% of homeowners insure their homes. On average, you'd come out ahead financially by refusing home insurance, but it's worth hedging your bets against ruin.
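The insurance logic can be made concrete with a hedged sketch: a negative-expected-value purchase can still be rational when it caps a ruinous downside. All numbers below are illustrative assumptions, not figures from the text:

```python
# Why a negative-expected-value purchase can still be rational:
# insurance trades a small certain loss for a cap on a ruinous one.
# All numbers are illustrative assumptions, not from the article.
home_value = 300_000
annual_premium = 1_500
p_total_loss = 0.002        # assumed annual chance of losing the home

ev_insured = -annual_premium                  # certain small loss
ev_uninsured = -p_total_loss * home_value     # expected loss of $600

print(f"EV insured:   {ev_insured:,.0f}")
print(f"EV uninsured: {ev_uninsured:,.0f}")   # better on average...
# ...but the uninsured tail outcome is -$300,000: potential ruin.
```

Going uninsured "wins" on average, yet it leaves the catastrophic tail outcome uncapped, which is the comparison that matters when ruin is on the table.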
Additionally, risks accumulate toward ruin. Our tendency to overestimate losses reflects all the risks we bear across our lives, not any single risk in isolation. If you lack loss aversion and take many undue risks (for example, you frequently forget to pay your credit card bills and hurt yourself parkouring), the costs add up, and you could find yourself in financial ruin. A hypersensitive aversion to small losses helps you avoid a cumulative, permanent loss.
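The cumulative point can be sketched numerically: even a small per-event chance of ruin compounds quickly over repeated exposure. The 1% figure below is an illustrative assumption, not from the text:

```python
# Repeated exposure to a small chance of ruin compounds:
# surviving N independent risky events requires surviving each one.
p_ruin_per_event = 0.01   # assumed 1% chance of ruin per risky event

for n in (1, 10, 100, 500):
    p_survive_all = (1 - p_ruin_per_event) ** n
    p_cumulative_ruin = 1 - p_survive_all
    print(f"{n:>3} exposures -> {p_cumulative_ruin:.1%} cumulative chance of ruin")
```

Under this assumption, a 1% risk taken a hundred times carries well over a 50% cumulative chance of ruin, which is why hypersensitivity to individually trivial risks can be adaptive.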
Why Taleb Hates Nudge
As we’ll see, Taleb uses the rationality of loss aversion to explain his view of religion—but of course, he can’t help insulting economist Richard Thaler again on the way there, calling him a “creepy interventionist.” Why does he resent Thaler so much?
In his book Nudge, Thaler argues that those offering choices should "nudge" people toward better, more "rational" decisions without forcibly coercing them. Thaler uses loss aversion as an example of "irrational" behavior that should be discouraged. For instance, he argues that employers should default their employees' retirement accounts to a fixed, prudent investment strategy, because loss aversion leads employees who must choose their own strategies to make irrational decisions that cost them money. Taleb counters that such discouragement would prevent people from creating strategies that protect them from ruin.
Taleb further details his argument against Richard Thaler's "nudging" in his Principia Politica. His main objection is that because those doing the "nudging" lack skin in the game, they're not held accountable for any unforeseen side effects of their large-scale interventions. In Taleb's eyes, such side effects are inevitable given the complexity of large-scale systems.
Because groups operate differently at different scales, Taleb argues that behavior that is rational for an individual may be irrational for the collective. If people are "nudged" to invest their retirement accounts in the same basket of strategies that is safest for each individual, the collective as a whole will suffer from a lack of diversity in strategy, and the potential damage of a single Black Swan event multiplies.
Taleb asserts that there’s no reason to ever risk ruin directly. There is always a way to take risks without enabling the possibility of ruin. This is why Taleb opposes the consumption of genetically modified crops. In his eyes, there are many other ways to feed the world that lack potential unknown dangers.
(Shortform note: This idea is summed up in Taleb’s “barbell strategy” from The Black Swan. Instead of investing in a single “moderately risky” venture, invest a lot of money in an extremely safe way and the rest in extremely aggressive, risky ventures with the potential for huge upside. By pursuing both extremes, you protect yourself from the rare risk of ruin and open up the possibility of an extremely rare windfall.)
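The barbell idea can be sketched as a toy allocation. The split and multipliers below are illustrative assumptions, not Taleb's figures; the point is that the worst case is bounded by construction, so ruin is off the table:

```python
# Toy barbell allocation: worst-case loss is capped by construction.
# All numbers are illustrative assumptions, not Taleb's figures.
capital = 100_000
safe_fraction = 0.9                 # assumed: 90% in a near-riskless asset
risky_fraction = 1 - safe_fraction  # 10% in aggressive, high-upside bets

safe = capital * safe_fraction      # assumed safe from loss
risky = capital * risky_fraction    # could go to zero or multiply

worst_case = safe + 0 * risky       # risky stake wiped out entirely
best_case = safe + 20 * risky       # rare windfall, e.g. a 20x outcome

print(f"Worst case: ${worst_case:,.0f}")   # floor: $90,000
print(f"Best case:  ${best_case:,.0f}")
```

Contrast this with putting everything into one "moderately risky" venture, where a single bad outcome can take the whole stake: the barbell keeps a hard floor under losses while leaving the upside open.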
———End of Preview———
Like what you just read? Read the rest of the world's best book summary and analysis of Nassim Nicholas Taleb's "Skin in the Game" at Shortform.
Here's what you'll find in our full Skin in the Game summary:
- Why having a vested interest is the single most important contributor to human progress
- How some institutions and industries were completely ruined by not being invested
- Why it's unethical for you to not have skin in the game