System Limits: Why They Happen + How to Stop Them

This article is an excerpt from the Shortform summary of "Thinking in Systems" by Donella H. Meadows. Shortform has the world's best summaries of books you should be reading.

Like this article? Sign up for a free trial here.

What are system limits? What types of system limits are there, and how can we overcome these limits to build better systems?

System limits are our own blind spots regarding systems: the mental shortcuts that keep us from seeing a system as it really is. Often, we limit ourselves by not using the system to its full potential.

Read more about system limits and how to overcome them.

Understanding System Limits

We try to understand systems to predict their behavior and know how best to change them. However, we're often surprised when a system behaves differently than we expected. Systems thinking is counter-intuitive in many ways, even for trained systems thinkers.

At the core of this confusion is our limitation in comprehension. Our brains prefer simplicity and can only handle so much complexity. That prevents us from seeing things as they really are. 

This chapter discusses a collection of such system limits. The underlying themes are:

  • Our cognitive biases color how we take in information.
  • We tend to focus on obvious points, ignoring the more subtle and complex drivers that really matter.
  • Systems often behave in ways that we’re not used to, such as changing nonlinearly or inducing delays.

Limitation #1: Focusing on Events

When we try to understand systems, we tend to see them as a series of events. 

  • History is presented as a series of factual events, such as presidential elections, wars, and treaties.
  • The news reports on stock market movements with great earnestness.

While events are entertaining, they're not useful for understanding the system. Events are merely the outputs of a system, and often its most visible aspects. But they rarely reveal how the system works or how it is structured, which means they don't help you predict how the system will behave in the future or show you how to change it.

In fact, an event-driven view of the world is entertaining precisely because it has no predictive value. If people could merely study daily events and predict the future, the news would lose its novelty and stop being fascinating.

Limitation #2: Ignoring Nonlinearities

In our daily lives, we expect the world to act linearly: a straight-line relationship between two things, where doubling one doubles the other. This expectation is one of our system limits. For example:

  • An object that is twice as heavy requires twice as hard a push to move.
  • If you earn a salary, your bank account increases by the same amount each month you work.
  • If it takes one hour to read a chapter of a book, it takes two hours to read two chapters.

However, much of the world doesn’t act linearly: the relationship between two elements can’t be drawn as a straight line. For example:

  • When a new product is released, a small amount of advertising may pique your interest. In contrast, too much aggressive advertising makes you annoyed at the ad, and you actively avoid buying the product.
  • On an empty highway, a single car can move at the speed limit without any problems. Add a few more cars, and the average speed doesn’t change much. This continues over a wide range of car density. But at a certain point, adding more cars causes traffic and slows down the average speed considerably. Add a few more cars, and traffic can come to a total stop.
  • A viral infection can simmer in a low number of cases for some time, then explode exponentially.

Nonlinearities often exist as a result of feedback loops. As we learned, a reinforcing feedback loop can lead to exponential growth. Even more complex nonlinearities can result when an action changes the relative strength of feedback loops in the system—thus, the system flips from one pattern of behavior to another.
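This shift in loop dominance can be sketched numerically. The toy simulation below (my own illustration, not from the book; all parameter values are assumptions) runs a logistic growth model: a reinforcing loop (growth proportional to the current population) dominates at first, and a balancing loop (crowding near a carrying capacity) takes over later, flipping the system from near-exponential growth to leveling off:

```python
# Toy logistic-growth model: a reinforcing loop dominates early,
# a balancing loop dominates late. Parameter values are illustrative.

def simulate(pop=10.0, rate=0.5, capacity=1000.0, steps=25):
    history = [pop]
    for _ in range(steps):
        # Reinforcing loop: growth proportional to current population.
        # Balancing loop: the (1 - pop/capacity) factor shrinks growth
        # as the population approaches the carrying capacity.
        pop += rate * pop * (1 - pop / capacity)
        history.append(pop)
    return history

history = simulate()
print(history[1] / history[0])    # ~1.5x growth per step at the start
print(history[-1] / history[-2])  # ~1.0x growth per step near the capacity
```

The same action (adding individuals) has very different effects depending on which loop currently dominates, which is why extrapolating the early straight-looking trend misleads.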

Limitation #3: Oversimplifying System Boundaries

When we studied stock-and-flow diagrams above, we represented the sources of inflows and the destinations of outflows as clouds. These clouds mark the boundaries of the system we’re studying.

We bound systems to simplify them, partly because we can only tolerate so much complexity, but also because too many extra details can obscure the main question. When thinking about a fishing population, it would be confusing to have to model the entire world economy and trace how, say, a football team’s performance might work its way down to fishing behavior.

However, we tend to oversimplify systems. We draw narrow boundaries that cause us to ignore elements that materially affect the system. This can cause the system to behave against our expectations, because we had an incomplete model of the system. For example:

  • A highway designer who ignores how people settle in a city will build a highway of a certain capacity. She may overlook that people tend to settle along a new highway, which soon leaves it insufficient for the traffic it must support.
  • A car manufacturer might concern itself only with buying parts from its supplier, ignoring how the supplier itself operates and sources its materials. If a global aluminum shortage occurred, the car manufacturer would be unpleasantly surprised.

We also tend to draw boundaries that feel natural but are irrelevant to the problem at hand: between nations, between rich and poor, between age groups, between management and workers, when in reality we want happiness and prosperity for all. These boundaries can distort our view of the system.

Limitation #4: Ignoring System Limits

In this world, nothing physical can grow without limit. At some point, something will constrain that growth. However, we naturally tend to avoid thinking about limits, especially when they involve our hopes and aspirations. 

Consider a typical company that sells a product. At all points, it depends on a variety of inputs:

  • Money
  • Labor
  • Raw materials
  • Machines to produce its product
  • Marketing and customer demand
  • Order fulfillment systems

At any point, one of these factors can limit growth. No matter how much of the other inputs you supply, the output stays the same; that one limiting factor constrains the growth of the system. For example, the company may have more than enough labor and machines to produce the product, but lack the raw materials needed to make full use of them. (Shortform note: This limit is often called a “bottleneck.”)
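The bottleneck idea has a simple numerical form, sometimes called Liebig's law of the minimum: output is governed by the scarcest input. The sketch below is my own illustration (the inputs and quantities are invented), not an example from the book:

```python
# Toy model of a limiting factor: output equals the scarcest input.
# Input names and quantities are illustrative assumptions.

def max_output(inputs):
    """Units of product possible given units of each required input."""
    return min(inputs.values())

inputs = {"labor": 500, "machines": 400, "raw_materials": 120}
print(max_output(inputs))  # 120: raw materials are the bottleneck

inputs["labor"] += 1000    # adding more of a non-limiting input...
print(max_output(inputs))  # ...still 120: the limit hasn't moved

inputs["raw_materials"] = 300  # relieving the bottleneck raises output...
print(max_output(inputs))      # ...to 300, until the next constraint binds
```

Note how relieving one constraint doesn't remove limits; it only shifts which input becomes the next bottleneck (here, machines at 400).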

Limitation #5: Ignoring Delays

As we saw in the car dealership example, delays happen at each step of a flow:

  • Delays in perceiving information
  • Delays in reacting to information
  • Delays for the reaction to cause a change

We habitually underestimate delays. You’ve experienced this personally whenever something took far longer than you expected, even when you tried to anticipate the delay.

Delays can cause surprising behavior with significant ramifications for society:

  • In understanding a disease, there is a delay between being exposed to a disease and showing symptoms. (Shortform note: This was relevant during the COVID-19 pandemic in 2020, when a long incubation period caused undetectable transmission.)
  • In the growth of an economy, there is a delay between when pollution is emitted and when it concentrates enough to cause noticeable problems, a further delay before we perceive those problems, and yet another delay before the system is changed to reduce the pollution.
  • In policy, there is a delay between signing a new law and its impact on the economy, which can take years or decades.

Delays cause us to make decisions with imperfect information: we may be acting on information that is already too old. In turn, our actions may overshoot or undershoot what the system needs, and, as with the manager of the car lot, this can lead to oscillations.
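The delay-and-overcorrection dynamic can be sketched with a toy inventory model. This is my own illustration in the spirit of the car-lot example, not the book's model; the delivery delay, demand shock, and the manager's ordering rule are all assumptions:

```python
from collections import deque

# Toy inventory model: orders arrive after a delay, and the manager
# reacts only to today's gap, ignoring cars already on order.
# All numbers are illustrative assumptions.

def simulate(steps=40, delay=3, desired=100):
    inventory = 100.0
    pipeline = deque([20.0] * delay)        # orders already in transit
    history = []
    for t in range(steps):
        sales = 20.0 if t < 10 else 25.0    # small, permanent demand shock
        inventory += pipeline.popleft()     # delivery ordered `delay` steps ago
        inventory -= sales                  # customers buy cars
        # The manager's rule: replace today's sales plus today's gap,
        # forgetting the orders still in the pipeline.
        order = max(sales + (desired - inventory), 0.0)
        pipeline.append(order)
        history.append(inventory)
    return history

h = simulate()
print(max(h), min(h[10:]))  # overshoots far above 100, then swings well below
```

A modest 25% demand increase, filtered through a three-step delay and a manager who ignores the supply line, produces large swings around the target instead of a smooth adjustment.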

Limitation #6: Bounded Rationality

We can only make decisions with the information we comprehend accurately. This is one of our most significant system limits.

First, sometimes the information simply doesn’t exist, especially when it’s far removed from us.

  • We don’t know what others will do.
  • We don’t foresee how our actions will affect the system.

Second, even if we had total information, we’re limited in the information we can take in and digest.

Third, even the information we can take in is biased. 

  • We overweight the present and discount the past, and we are biased toward information that confirms our existing beliefs.
  • (Shortform note: Read more about cognitive biases in Thinking, Fast and Slow and Influence.)

It’s small wonder then that we can act with good intentions but cause bad results.

  • A fisherman is thinking about the loan on his boat, his newborn child, and the risk of injury that might jeopardize his career. He doesn’t have information about the global stock of fish. So he tends to overfish.

———End of Preview———

Like what you just read? Read the rest of the world's best summary of Donella H. Meadows's "Thinking in Systems" at Shortform.

Here's what you'll find in our full Thinking in Systems summary:

  • How the world, from bathtub faucets to fish populations, can be seen as simple systems
  • The key system traps that hold back progress, such as escalating arms races and policy addiction
  • Why seeing the world as systems can give you superpowers in work and life
