
When disaster strikes, we often blame bad luck. But Niall Ferguson’s Doom: The Politics of Catastrophe contends that we’re missing the real story. Catastrophes reflect how our societies are wired—the networks, institutions, and systems that connect us.

Continue reading for our overview of the book, and discover why some crises spiral out of control while others don’t.

Overview of Doom: The Politics of Catastrophe

Historian Niall Ferguson’s Doom: The Politics of Catastrophe (2021) argues that disasters are fundamentally political phenomena rather than purely natural or random events. He contends that understanding catastrophes requires examining the social networks, institutions, and political systems that disasters put under pressure rather than focusing solely on their immediate causes. Whether we’re dealing with pandemics, earthquakes, financial crashes, or nuclear accidents, Ferguson argues that their ultimate impact depends on human decisions about preparedness and response as well as the underlying strength or weakness of our societies.

This perspective matters because our conventional approaches to thinking about disasters consistently fail us. We blame individual leaders when systemic failures occur, we try to predict unpredictable events instead of building resilience, and we misunderstand why some societies successfully weather crises while others collapse. Ferguson wrote Doom during the Covid-19 pandemic to place our recent experience in its proper context, drawing on his expertise as a financial historian and the author of numerous books on economics, military history, and global affairs. He argues that disasters reveal the true character of civilizations, and that our current institutions may be poorly equipped for future catastrophes.

This overview explores Ferguson’s analysis in three sections:

  • What disasters really are and how they spread through networks
  • How disasters become catastrophes through systemic human failures to recognize and address obvious vulnerabilities
  • How we can build better disaster preparedness that breaks this cycle

What Are Disasters?

Ferguson defines disasters as unpredictable events that become catastrophic through their interaction with human networks. This definition rests on three principles: Disasters are unpredictable; even "natural disasters" are political phenomena; and disasters spread through networks that can turn otherwise manageable problems into civilization-threatening catastrophes.

Disasters Are Unpredictable

Ferguson notes that death is the one certainty in human existence, and disasters are concentrated expressions of mortality that reveal the fragility of human civilization. But even though death is certain, it often isn’t predictable. Ferguson argues that most disasters don’t occur in regular, predictable cycles. Instead, their probability follows what statisticians call “power laws”—mathematical patterns that are different from the “normal” statistical distributions that govern most everyday phenomena. 

To understand this, says Ferguson, compare the different probability patterns for human height versus earthquake magnitude. Human height follows a “normal distribution,” a mathematical term describing how frequently different values occur in a dataset. If you graphed the probability of encountering people of different heights, you’d see that most people cluster around an average height in the middle of the graph, while the probability of encountering very tall or very short people drops off sharply at the edges. This concentration around a central average makes human height predictable: If you know the average height in a population is about 5’6”, you can predict that you’ll very rarely encounter someone over seven feet tall.

In contrast, Ferguson explains that due to power laws, there's no "typical" disaster size. If you graphed the probability of earthquakes of different magnitudes, you wouldn't see them clustered around some typical size. Instead, you'd see that small earthquakes happen frequently, while large earthquakes are far more probable than a normal distribution would suggest. And the magnitude scale understates the difference: A magnitude 7.0 earthquake isn't slightly worse than a magnitude 6.0; it releases roughly 32 times more energy. Magnitude 8.0 earthquakes, which release about 1,000 times more energy than a 6.0, are rare, but not impossibly rare in the way that eight-foot-tall humans are.
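To make this arithmetic concrete, here's a short Python sketch (ours, not Ferguson's) that computes the energy ratios above from the standard Gutenberg-Richter energy relation, then contrasts the thin tail of a normal distribution with the fat tail of a power law. The power-law exponent of 2 is an arbitrary illustrative choice:

```python
import math

# Seismic energy grows as 10^(1.5 * magnitude) (the Gutenberg-Richter
# energy relation), so each whole step in magnitude multiplies the
# energy released by about 31.6.
def energy_ratio(m1, m2):
    """How many times more energy a magnitude-m2 quake releases than a magnitude-m1 quake."""
    return 10 ** (1.5 * (m2 - m1))

print(f"7.0 vs 6.0: ~{energy_ratio(6.0, 7.0):.0f}x more energy")  # ~32x
print(f"8.0 vs 6.0: ~{energy_ratio(6.0, 8.0):.0f}x more energy")  # ~1,000x

# Tail comparison: normal distribution vs. power law.
def normal_tail(z):
    """P(Z > z) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def power_law_tail(x, x_min=1.0, a=2.0):
    """P(X > x) for a Pareto-style power law with exponent a."""
    return (x / x_min) ** (1 - a)

# Six standard deviations above average (the eight-foot human): vanishingly rare.
print(f"Normal tail at 6 sigma:   {normal_tail(6):.1e}")     # ~1e-9
# Six times the minimum event size under a power law: still quite plausible.
print(f"Power-law tail at 6x min: {power_law_tail(6):.1e}")  # ~1.7e-1
```

Six steps out, the normal curve has essentially nothing left, while the power law still assigns real probability: that's the gap between eight-foot humans and magnitude 8.0 earthquakes.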

Why Disasters Consistently Surprise Us

According to Ferguson, this statistical pattern explains why disasters consistently surprise us. Power law distributions mean that catastrophic events—the ones that kill hundreds of thousands of people or reshape civilizations—occur more often than our everyday experience would suggest, but rarely enough that we’re not psychologically prepared for them. Ferguson notes that we can identify general categories of threats that are likely to happen: We know that, sooner or later, a pandemic, earthquake, or financial crash is inevitable. But we cannot predict the timing, magnitude, or specific characteristics of these disasters with sufficient precision to prepare for exact scenarios before they occur.

When a disaster does occur, we can use statistics to understand its impact. Ferguson measures disaster severity through “excess mortality”: About 160,000 people die globally every day under normal circumstances. Ferguson uses the number of deaths above and beyond this baseline to put disasters in perspective. For example, Covid-19 increased global mortality by 1.8% in early 2020, a relatively small perturbation compared to the 6th century’s Plague of Justinian or the Black Death of the 14th century, which each killed more than 30% of affected populations. Ferguson argues that understanding these scales is essential, but these patterns still can’t tell us when the next major disruption will occur or how severe it will be.
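As a back-of-the-envelope illustration (our arithmetic, applying the book's figures above), the excess-mortality yardstick works like this:

```python
# Excess-mortality arithmetic using the figures cited above.
baseline_daily_deaths = 160_000  # approximate global deaths per day in normal times
covid_excess_rate = 0.018        # Covid-19's early-2020 bump in global mortality

excess_per_day = baseline_daily_deaths * covid_excess_rate
print(f"Excess deaths per day in early 2020: ~{excess_per_day:,.0f}")  # ~2,880

# By contrast, the Plague of Justinian and the Black Death each killed more
# than 30% of affected populations, a disruption of a different order entirely.
```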

All Disasters Are Political Phenomena

While we often talk about events such as earthquakes, volcanic eruptions, and floods as "natural disasters," Ferguson challenges the distinction we usually make between "natural" and "man-made" catastrophes. He contends that all disasters are fundamentally political, even when they originate from natural causes, because a disaster's impact depends less on the triggering event and more on how it interacts with social, political, and economic systems.

Ferguson notes that an earthquake is natural. But whether it affects comparatively few people or becomes a civilization-ending catastrophe depends on human decisions, such as where people built cities, how they constructed their buildings, what early warning systems they put in place, and how effectively authorities coordinate rescues. The 2010 earthquakes in Haiti and Chile illustrate this principle: These two earthquakes occurred within weeks of each other, and Chile's was actually far more powerful (magnitude 8.8 versus Haiti's 7.0). Yet Haiti's death toll reached over 200,000 while Chile's remained under 1,000. The difference was political, economic, and institutional: Chile had better building codes, more effective government coordination, and superior emergency response.

Disasters Spread Through Networks

Ferguson explains that societies are organized as networks. A network is a system of connections among points such as individuals and communities: Transportation systems move people and goods, communication systems spread information, economic systems link markets, and social systems connect communities. Since he sees disasters as amplified or contained by human systems, Ferguson argues that the structure of these systems determines how a disaster plays out. For example, a pandemic spreads when an infected person travels from one city to another, passing the disease along to the people they encounter, who then spread it to their own contacts.

Ferguson argues that this reframes how we normally think about disasters: Rather than viewing them as external shocks to human systems, we should see disasters as manifestations of the underlying strengths and weaknesses of our political and social systems. He says that the network structures that define modern civilization enable these disasters to be contained—or to become catastrophic. If we understand the systems that support human societies for what they are—networks—we find consistent structures and predictable dynamics that let disasters wreak havoc in surprisingly unsurprising ways. 

How Do Disasters Become Catastrophes?

Not every disaster results in the loss of life or upends social and political order. Ferguson argues that disasters become catastrophes when a predictable progression of events occurs: Network structures create pathways for escalation, our psychology prevents us from recognizing these risks, institutional incentives discourage authorities from acting on warnings, and coordination failures enable cascade effects that can reshape entire civilizations. (Cascade effects are chain reactions where failures in one part of a network trigger failures in connected parts.) We’ll take a closer look at how Ferguson explains each step in this progression.

Networks Create Pathways for Disaster Escalation

Ferguson emphasizes that the same event can have completely different impacts depending on the structure and characteristics of the networks it encounters. He explains that three characteristics of networks determine whether disasters are effectively contained or spiral out of control: the central role of hubs, the speed of communication across the network, and the challenge of coordinating responses among the different networks charged with managing a crisis.

First, many networks have a “scale-free structure,” where a few highly connected hubs link to many smaller nodes. (This structure is called “scale-free” because there’s no typical number of connections: The distribution of connections follows a power law, just like disasters.) These hubs make networks efficient under normal circumstances but vulnerable to catastrophic failure. Hurricane Katrina illustrates how this works. The devastating tropical storm hit the US Gulf Coast in August 2005 and flooded 80% of New Orleans, which was a hub for energy production, shipping, telecommunications, and transportation. Effects cascaded nationwide: Energy prices spiked, supply chains were disrupted, and communication broke down.
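To see why hubs are both efficient and fragile, here's a hypothetical simulation sketch (not from the book) using Python's networkx library. It grows a scale-free network through preferential attachment, then compares how well the network holds together after losing 5% of its nodes at random versus losing its 50 biggest hubs:

```python
import random
import networkx as nx

# Grow a scale-free network by preferential attachment (Barabasi-Albert model):
# each new node links to m existing nodes, favoring already well-connected ones,
# which yields a few heavily connected hubs and many sparsely connected nodes.
G = nx.barabasi_albert_graph(n=1000, m=2, seed=42)

def largest_component_share(graph):
    """Fraction of surviving nodes in the biggest connected cluster."""
    biggest = max(nx.connected_components(graph), key=len)
    return len(biggest) / graph.number_of_nodes()

def knock_out(graph, nodes):
    """Copy the graph with the given nodes (and their links) removed."""
    g = graph.copy()
    g.remove_nodes_from(nodes)
    return g

k = 50  # remove 5% of the 1,000 nodes in each scenario

# Scenario 1: random failures. Most nodes have few links, so damage stays local.
random.seed(0)
random_nodes = random.sample(list(G.nodes), k)

# Scenario 2: targeted failures. Take out the k most-connected hubs.
hubs = sorted(G.nodes, key=lambda node: G.degree[node], reverse=True)[:k]

print(f"After random failures: {largest_component_share(knock_out(G, random_nodes)):.1%} still connected")
print(f"After hub failures:    {largest_component_share(knock_out(G, hubs)):.1%} still connected")
```

Random failures mostly hit poorly connected nodes, so the network barely notices; knocking out the hubs fragments it. That asymmetry is the Katrina-style dynamic Ferguson describes.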

Second, Ferguson contends that the sheer speed at which communications and coordination can travel across modern human networks is unprecedented. But these incredible speeds often prove a double-edged sword when disaster strikes. During Katrina, television reporting and radio networks enabled the real-time coordination of rescue efforts and helped keep evacuees informed. But the same communication networks amplified false reports about widespread violence that deterred rescue workers and delayed the federal response to the storm. In this sense, network speed created a timing problem: Rumors could spread faster through social networks than official information could be verified by authorities.

Third, Ferguson argues that the most critical factor separating disaster containment from catastrophic failure is how well different networks coordinate during a crisis. Poor coordination turns disasters into catastrophes, as Katrina illustrated: Coordination between weather services, media, and transportation systems enabled the largest peacetime evacuation in US history before the storm made landfall. But coordination during and after the storm failed between federal, state, and local emergency response because no single authority could manage efforts across overlapping jurisdictions and incompatible communication systems.

We Fail to Recognize Network Vulnerabilities

The networks that structure society are complex, and Ferguson argues that human psychology and institutional biases create blind spots in our understanding of these networks. A cognitive tendency called the "availability bias" causes us to focus on the most vivid aspects of disasters, such as dramatic images of flooding, rather than understanding how disasters spread through interconnected systems. This means we fail to prepare for the network effects that turn manageable problems into catastrophic failures. Ferguson calls the resulting disasters "gray rhinos," a term Michele Wucker coined in The Gray Rhino to describe obvious, high-probability threats that we nevertheless fail to address.

The lack of preparation in New Orleans ahead of Hurricane Katrina shows the availability bias in action. Officials could easily imagine a major hurricane hitting the Gulf Coast—that vivid, dramatic threat was obvious to everyone. The city had hurricane tracking abilities, engineering expertise, and evacuation infrastructure focused on that clear danger. But the availability bias prevented officials from addressing the network vulnerabilities: how levee failures could cascade through the city, how communication breakdowns would prevent coordination, or how the city’s role as an energy and transportation hub would amplify the disaster’s impact. The direct threat was obvious, but the network effects were invisible until they became catastrophic.

Ferguson notes that these blind spots are compounded when we confuse uncertainty, which involves situations too complex for meaningful probabilities, with calculable risk. Disasters involve uncertainty because we can't predict which nodes of a network will fail or how failures will cascade. Yet institutions often treat these uncertainties as calculable risks they can manage through better analysis, leading to false confidence in disaster planning: Emergency management agencies assume they can predict how disasters will move through networks, but real catastrophes often exploit network pathways that planners didn't consider or prepare for.

We Ignore Warnings About Network Risks

Even when experts understand network vulnerabilities and issue clear warnings, Ferguson notes there are systemic reasons why these warnings get dismissed. First, he points out that such warnings implicitly threaten the existing institutional arrangements, budget allocations, and career paths that authorities have built around current systems. Acknowledging network vulnerabilities would mean conceding that our institutions are inadequate for managing the risks future disasters might pose, potentially requiring expensive reorganizations and admissions of past mistakes. This creates incentives for officials to dismiss warnings rather than admit their institutions lack needed coordination capabilities.

Ferguson calls the resulting disasters “black swans,” a term introduced by Nassim Nicholas Taleb in The Black Swan. These events feel impossible based on recent experience, but they’re actually predictable, like the Covid-19 pandemic. Virologists had warned that a pandemic was inevitable and would require rapid testing, contact tracing, and international information sharing. In the US, authorities handled the 1957 Asian flu pandemic with coordination between government agencies, pharmaceutical companies, and international health organizations. But, by 2020, these relationships had atrophied. Preparing for a pandemic would have required admitting that public health institutions weren’t prepared for networked threats, an acknowledgment officials avoided.

Additionally, says Ferguson, warnings are often dismissed or lost as they pass through institutions. Officials can usually find expert opinions to support inaction because network effects are complex and uncertain. Early in the Covid-19 pandemic, some experts warned about transmission through travel and superspreader events, but others downplayed these network effects. At the same time, experts’ warnings had to travel through bureaucratic networks of middle managers at agencies such as the US Centers for Disease Control and Prevention (CDC). These managers consistently overruled or ignored recommendations about mass testing and contact tracing, demanding certainty rather than preparing for uncertain but probable risks.

Coordination Failures Create Civilization-Altering Catastrophes

Finally, even when it’s clear that disaster has struck and a decisive response to the crisis is necessary, some disasters spiral into catastrophe because authorities fail to effectively coordinate their responses. Ferguson calls these “dragon kings,” a term Didier Sornette coined to describe disasters with consequences so severe that they reorganize entire societies. Dragon kings emerge when coordination failures create cascade effects that overwhelm a society’s capacity to adapt. Unlike typical disasters that damage existing systems, dragon kings reorganize network relationships and create new vulnerabilities that can generate catastrophes for generations, ultimately reshaping the basic structures of human civilization.

World War I demonstrates this pattern: Ferguson notes the war began as a regional crisis (an assassination in Sarajevo), but diplomatic and military coordination failures transformed it into a global catastrophe. European alliances lacked effective crisis management mechanisms, while military networks prioritized speed over flexibility, effectively eliminating diplomatic solutions. The war can be classified as a “dragon king” disaster because it destroyed four empires and created dozens of new nation-states. Furthermore, the flawed institutions created to manage the post-war world established the unstable economic and political relationships that contributed to later catastrophes, including the Great Depression and World War II.

How Can We Build Better Disaster Preparedness?

Ferguson’s analysis of how disasters become catastrophes explains why societies fail at disaster response: There’s a mismatch between the vulnerabilities our networks create and our institutions’ ability to recognize and address those vulnerabilities in time. Understanding this dynamic raises a question: Given these systemic limitations, how can societies build better disaster preparedness? Ferguson explains that to break the cycle, we have to build systems that can spot vulnerabilities before they become gray rhinos, act on warnings before they become black swans, and coordinate effectively across networks to prevent dragon kings.

Spot Network Vulnerabilities Before They Become Gray Rhinos

Gray rhino disasters happen because institutions have blind spots that prevent them from seeing obvious risks. Ferguson recommends that we create systems that face uncomfortable truths and have formal ways to identify and discuss vulnerabilities that threaten them. Democratic societies have historically learned from successes and failures through elections, media scrutiny, and public debate. But Ferguson argues that modern information networks optimize for engagement rather than accuracy, and political polarization prevents information flow between different communities. Effective vulnerability assessment requires us to rebuild ways to process conflicting information without becoming paralyzed by disagreement.

Ferguson also contends that it’s crucial to distribute risk assessment across multiple perspectives: Rather than relying on single institutions to identify risks, resilient societies spread the responsibility for recognizing vulnerabilities across multiple organizations. In the US, for example, the CDC’s monopolization of Covid-19 testing created a single point of failure that prevented an effective early response. Conversely, the countries that were most successful at managing Covid-19—such as Taiwan and South Korea—drew on diverse institutions that could spot vulnerabilities from different positions within their systems.

To prevent gray rhinos, Ferguson argues that we need to make it rewarding for authorities and institutions to acknowledge risks. Gray rhinos happen because institutions benefit from arrangements that create vulnerabilities. Effective preparedness requires creating career incentives for officials who identify and address vulnerabilities before they become crises. This might involve creating roles specifically responsible for challenging how existing networks function, with career advancement tied to successfully identifying vulnerabilities.

Act on Warnings Before They Become Black Swans

Black swan disasters happen because institutions dismiss or ignore accurate warnings about network vulnerabilities until after catastrophes actually occur. So Ferguson contends that we need to make preparedness politically rewarding. Political systems typically reward visible, local actions with clear and immediate benefits. But making networks more resilient provides broad, long-term benefits that are spread across society and hard for individual politicians to claim credit for. Ferguson argues that regular public reporting on vulnerability assessments and preparedness capabilities could let politicians compete on disaster readiness, rather than just disaster response.

He also contends that institutions need to build the capacity for rapid network changes: Disasters that strike modern networks often require faster responses than normal institutional processes allow. Black swan events feel surprising partly because the time needed for institutional deliberation exceeds the time available for effective intervention. Ferguson argues that rather than trying to predict specific threats, institutions should focus on building the capacity to make rapid changes when early warning signals emerge. This requires pre-positioned authority for network disruption and coordination, similar to how financial markets have circuit breakers that can halt trading when volatility exceeds safe thresholds.

To prevent black swans, Ferguson also recommends that we protect diverse expert opinions. Groupthink consistently leads institutions to dismiss evidence that contradicts prevailing models. But preventing black swan disasters requires maintaining expert networks that can challenge institutional orthodoxy without being marginalized. This means protecting space for dissenting expert opinions and preserving ways to integrate conflicting perspectives rather than eliminating disagreement through conformity pressure.

Coordinate Effectively to Prevent Dragon Kings

Dragon king disasters happen when communication breaks down between different parts of the government and society, creating cascade effects that overwhelm everyone’s ability to respond. Ferguson explains that establishing clear coordination procedures before crises occur is essential. The most dangerous breakdowns occur where different groups meet—engineers and politicians, city and federal officials, government agencies and private companies. So, officials should identify these critical connection points and establish protocols for communicating and coordinating effectively ahead of time, rather than trying to improvise during emergencies.

Ferguson also recommends distributed coordination, not centralized control. When one central authority tries to manage everything, it creates a single point of failure and can’t adapt quickly enough to fast-moving threats. Instead, effective coordination means giving different groups the authority to make decisions while staying in touch with each other. This approach also preserves democratic accountability: Democratic governments generally outperform authoritarian ones in disaster management because they can learn from both successes and failures, but these advantages disappear if surveillance systems eliminate the diversity of approaches that makes learning possible.

This is why Ferguson warns that many proposed solutions—such as comprehensive surveillance or global coordination systems—could create centralized control hubs that are more dangerous than the disasters they’re designed to prevent. Instead of unified systems that eliminate alternatives, the goal should be building resilience by letting different institutions learn from each other’s experiences. Disaster preparedness is less about predicting specific catastrophes than building institutions that can respond to changing conditions. This helps preserve the diversity and competition necessary for continued learning and innovation.


Elizabeth Whitworth

Elizabeth has a lifelong love of books. She devours nonfiction, especially in the areas of history, theology, and philosophy. A switch to audiobooks has kindled her enjoyment of well-narrated fiction, particularly Victorian and early 20th-century works. She appreciates idea-driven books—and a classic murder mystery now and then. Elizabeth has a Substack and is writing a book about what the Bible says about death and hell.
