Design Guidelines: Minimizing Human Error

This article is an excerpt from the Shortform book guide to "The Design of Everyday Things" by Don Norman.


Are there design guidelines you can follow? Can these guidelines help you minimize design errors?

While every design has to be carefully considered, certain design guidelines can help. Following them matters because they can prevent potentially dangerous errors.

Read more about design guidelines and how they can help minimize errors and improve designs.

Specific Guidelines for Designers

How can we create products and services that minimize errors, especially dangerous ones? Norman suggests some concrete guidelines:

  • Understand the factors that contribute to errors and identify which ones are possible to control. 
  • Make the system resilient. A certain degree of human error is unavoidable, but a single error should not be able to cause a chain reaction that leads to disaster. 
  • Make the “undo” function easy to access and available at every possible step.
    • For actions that cannot be undone, require multiple confirmations that the user wants to proceed with that action. This should be paired with a clear image of the specific item being acted on (to prevent such mistakes as deleting the wrong file). 
  • Use error messages to present users with guidance on how to fix the problem.
  • Remember that most errors don’t require completely starting over. Make it easier for users to fix a single step instead of the entire chain of actions.
  • Use constraints to prevent incorrect actions.
  • Incorporate “sensibility checks.” 
    • Machines follow commands as long as they are in the correct format, regardless of whether they make logical sense. This leads to errors and even tragedy, particularly in medical environments (for example, an x-ray machine that is not error-proofed will allow a technician to mistakenly enter a command for a lethal dose of radiation). Error-proof technology is pre-programmed to understand the limits of each value and alerts the user when a value is entered outside the acceptable range. 
  • Use frequent, effective feedback to prevent slips (for example, in hospitals, both prescription labels and patient ID bracelets are scanned before administering any medicine to ensure that the right person is getting the right drug). 
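The "sensibility check" and error-message guidelines above can be sketched in code. This is a hypothetical illustration, not an implementation from Norman's book; the parameter names and safe ranges below are invented for the example:

```python
# Hypothetical sketch of a "sensibility check": the machine knows an
# acceptable range for each value and refuses anything outside it,
# rather than executing any well-formed command. Ranges are invented.

ACCEPTABLE_RANGES = {
    "radiation_dose_mgy": (0.1, 10.0),   # assumed safe dose range
    "exposure_time_s": (0.01, 2.0),      # assumed safe exposure range
}

def sensibility_check(parameter: str, value: float) -> bool:
    """Return True if the value falls within the machine's known safe range."""
    low, high = ACCEPTABLE_RANGES[parameter]
    return low <= value <= high

def enter_command(parameter: str, value: float) -> str:
    if not sensibility_check(parameter, value):
        low, high = ACCEPTABLE_RANGES[parameter]
        # Per the guideline above, the error message guides the user
        # toward a fix instead of silently accepting or rejecting input.
        return (f"REJECTED: {parameter}={value} is outside the safe range "
                f"{low}-{high}. Please re-check the prescription.")
    return f"ACCEPTED: {parameter} set to {value}"

print(enter_command("radiation_dose_mgy", 5.0))    # normal dose: accepted
print(enter_command("radiation_dose_mgy", 500.0))  # lethal dose: rejected
```

Note that the check rejects the dangerous command at entry time, before any chain reaction can start, which is exactly the resilience the guidelines call for.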

How Can Designers Minimize Errors Using Design Guidelines?

Although it’s not possible to eliminate errors completely, thoughtful design can reduce their frequency and severity. The first step in that process is detecting when and where errors occur. As you consider the types of errors below, keep the design guidelines above in mind.

Detecting Error

A truly error-proof design makes it easy to detect errors before they become dangerous. Doing this requires understanding how we notice errors in the first place, and more importantly, why we sometimes fail to notice them even when following design guidelines.

In general, slips are easier to detect than mistakes. Detecting simple errors like action-based slips is typically easy—if you accidentally put your keys in the freezer, you’re likely to realize it pretty quickly. Memory-lapse slips are harder to detect until something cues retrieval of the memory (for example, not realizing you left your wallet at home until you need to pay for gas). 

Mistakes are difficult to detect because they are conscious choices: we genuinely believe we’re making the right decision at the time, so we usually don’t recognize a mistake until later, when something goes wrong and the cause is traced back to it.

One reason we don’t catch mistakes earlier is the natural human tendency to explain away minor deviations from the norm. The author tells a story of driving with his family to a ski resort in California and passing several billboards for Las Vegas hotels. The family agreed that advertising on billboards located hours away from Las Vegas must be an odd marketing strategy and carried on with their journey, not realizing until two hours later that they’d missed a turn and were mistakenly headed straight toward Las Vegas. We’re much more likely to notice novel information in our environment, but once we have an explanation, it’s no longer novel. This explains why the author’s family was able to ignore all the other Las Vegas advertisements they passed before finally realizing their mistake. 

In the aftermath of an accident, the chain of events leading up to it often seems obvious. This is the power of hindsight bias, or the tendency to overestimate our ability to have predicted a certain outcome before it happened. We wonder how anyone could have missed the signs of an important error when they seem so obvious—in reality, without the benefit of hindsight, we’d most likely have missed them too. 

One way to improve error detection is with checklists. Checklists are helpful tools, but they need to be designed with social influences in mind. Having multiple people run through checklists helps in error proofing, but this should always take the form of two people working simultaneously, not sequentially. Having one person run through a checklist now and another person double-check things later can actually lead to more errors, since there is a tendency to let things slide, knowing someone else is likely to catch the mistake later. But when everyone takes this attitude, errors quickly add up. (Shortform note: To learn how to correctly use checklists, read our summary of The Checklist Manifesto.)

Understanding How Accidents Happen

There is rarely only one cause of an accident. More frequently, accidents are the result of a number of conditions lining up in a particular way. James Reason, an accident researcher, calls this “the Swiss cheese model.” Think of each slice of cheese as a condition affecting a certain task (for example, weather). The holes in each slice are all the possible configurations of that condition (in the weather example, this would mean a hole for rain, a hole for snow, a hole for bright sun, and so on). 

For accidents and errors to occur, the holes in several slices have to line up perfectly. If the hole in any one slice doesn’t line up, the event can’t happen. 

In a car accident, for example, four slices might represent weather, alertness of the driver, condition of the brakes, and speed. If the holes line up perfectly—if it’s raining, the driver is sleep deprived, the brakes are worn out, and the driver is speeding—an accident is likely. But if any one of those holes didn’t line up (if the driver were more alert, say, or the brakes were new), the accident could likely have been avoided. 

To prevent accidents, we need to prevent the holes from lining up. There are three main ways to do this:

  • First, create more conditions that must be met in order for a product to function properly (this adds more slices of cheese, making it statistically less likely that holes will line up).
  • Second, error-proof each condition as much as possible (this reduces the number or size of holes in each slice). 
  • Third, include feedback at every stage (this alerts the user if certain holes are beginning to line up, giving them a chance to stop the process before an accident occurs). 

These design guidelines will not only help you create better designs, but will also help you prevent damaging and potentially dangerous errors.



Carrie Cabral
