This article is an excerpt from the Shortform book guide to "Black Box Thinking" by Matthew Syed. Shortform has the world's best summaries and analyses of books you should be reading.

Like this article? Sign up for a free trial here.

Is healthcare culture failure-averse? When mistakes and failures occur, should they be acknowledged and treated as learning opportunities?

In Black Box Thinking, the author discusses how certain industries progress by learning from failure. For example, the airline industry carefully tracks and learns from its mistakes, while our healthcare culture has an aversion to failure.

Keep reading to learn why failure aversion in healthcare culture stunts growth in the healthcare system.

Case Study: Failure Aversion in the Health Care System

Syed argues that in the healthcare culture, hospitals have numerous opportunities to learn from failure but often disregard them. This is because they operate according to top-down systems that go unquestioned, and healthcare cultures stigmatize failure.

Problem #1: Healthcare Cultures Avoid Acknowledging Mistakes 

Doctors, nurses, and other health care professionals train for years, so they expect perfection from themselves and their peers. Mistakes carry a strong stigma—if you mess up as a surgeon or nurse, you’ll be looked down upon—and the more senior your role, the stronger the blame.

Because of this, Syed argues, many health care workers fear reporting mistakes. Reporting their own errors leads to consequences, and reporting their superiors might provoke retaliation. As we explained above, cognitive dissonance compounds the problem: A doctor can unwittingly suppress his memory of a failure, justify the mistake, or outright deny that it happened. By doing so, he preserves his reputation at the expense of continued patient harm. 

(Shortform note: In a January 2022 article, emergency room nurse Sally Ersun details her gut-wrenching last day on the job, highlighting the systemic issues Syed refers to. Understaffed and overburdened, her department had too few supplies and could not provide blood for a dying man; another patient was left unattended and fell and hit her head—and there was no time to report the incident. Ersun explains that she’s been “threatened” by superiors, and ultimately quit due to physical and emotional exhaustion. A nurse for 10 years, she argues that health care’s for-profit model “prioritizes finances over lives,” which stoutly corroborates Syed’s analysis of health care.) 

Problem #2: Healthcare Systems Don’t Analyze Failures 

Many hospitals also lack systems for reporting, investigating, and improving upon errors. Syed cites a report showing that fewer than 20 US states require error reporting mechanisms in hospitals. Of those states, few consistently investigate errors and enforce changes. Another study found that in a sample of 273 hospitalizations, hospitals “missed or ignored” 93% of preventable errors.

In healthcare culture, investigating errors simply isn’t the convention. Because of this, numerous mistakes—including hundreds of thousands of preventable deaths annually—go unexamined. Many deaths from surgery, medication error, and neglect are written off as “inevitable” or “one-off” tragedies.

(Shortform note: One key to effective investigations is to work with an independent investigator. In 2016, a National Health Service investigation demonstrated the need for this, showing that hospitals consistently treated family members of the deceased with little courtesy and often ignored or blocked their requests for information. The Care Quality Commission (CQC), a care watchdog based in England, is attempting to establish an independent review process to enforce accountability. Their first goal is to secure better treatment for the families, which the CQC determined hospital workers view as “antagonistic.”) 

Since they don’t learn from their mistakes, hospitals also lack “institutional memory,” a shared compendium of lessons learned. Without this, the few lessons learned take years, even decades, to percolate through the broader health care system.

The infrequency of autopsies exemplifies the problem, according to Syed:

  • Autopsies are health care’s version of the black boxes used in aviation: They enable doctors to look clearly at the precise, objective details of what went wrong. 
  • Close to 80% of families give permission for an autopsy. Despite this, almost none are performed—fewer than 10%.
  • Each autopsy is an opportunity for doctors to learn what went wrong and improve their processes. So each time they’re passed up, potentially life-saving learnings go down the drain.
  • Syed argues that this happens because of healthcare culture: The doctor fears the shame of knowing his failure, so he avoids ordering autopsies. This preserves his self-image and his reputation as a consummate professional, at the cost of important insights that could improve his systems.

(Shortform note: As recently as the 1950s, US hospitals performed autopsies on around 50% of all deaths. Virtually all medical experts agree that autopsies are invaluable for determining the cause of a death, yet they’re expensive, time-consuming, and must be performed at the hospital’s expense. Modern physicians argue that advanced imaging technologies and budgetary concerns render autopsies unnecessary, though one study of autopsies performed to confirm clinical diagnoses found a median error rate of 23.5%, showing that they remain relevant for determining the cause of death and for learning from what happened.)


———End of Preview———

Like what you just read? Read the rest of the world's best book summary and analysis of Matthew Syed's "Black Box Thinking" at Shortform.

Here's what you'll find in our full Black Box Thinking summary :

  • How an organization’s culture and systems either promote or prevent learning
  • The steps for learning from failure in our complex world
  • How to shift mindsets around failure to promote a learning-oriented institution
