Thinking in Systems
by Donella H. Meadows
Read Status: Completed 📕
Last Updated: 26 Sep 2021
12 min read ☕️
We live in a world of complex systems. We are complex systems. Systems thinking offers an alternative to the reductionist thinking favored in scientific studies. The book covers 3 reasons why systems work well, 8 system traps & opportunities, 12 leverage points to intervene, and 15 general systems wisdoms.
In short, this book is poised on a duality. We know a tremendous amount about how the world works, but not nearly enough. Our knowledge is amazing; our ignorance even more so.
System thinking is an important skill set that software engineers should pick up. It helps us see the big picture and appreciate the interactions and relationships between different complex pieces in a software system. This book is one of the highly recommended books to deep-dive into thinking in systems.
Given that the author's background is in environmental science, the environmental and economic examples used in the book made me reminisce about my university days when I took a module on environmental earth systems science.
The current Covid-19 pandemic and the various government responses to this health crisis serve as a good case study for systems thinking.
"As one who advocates for changes in how a society (or a family) functions, you may see years of progress easily undone in a few swift reactions."
Diana Wright, the editor, believes that this book can help us better understand the real-world systems around us, appreciate how they change, and finally, learn to manage and redesign systems for the greater good.
"If a factory is torn down but the rationality which produced it is left standing, then that rationality will simply produce another factory. If a revolution destroys a government, but the systematic patterns of thought that produced that government are left intact, then those patterns will repeat themselves. . . . There’s so much talk about the system. And so little understanding."
- Robert Pirsig, Zen and the Art of Motorcycle Maintenance
Once we see the relationship between structure and behavior, we can begin to understand how systems work, what makes them produce poor results, and how to shift them into better behavior patterns ... It is a way of thinking that gives us the freedom to identify root causes of problems and see new opportunities.
What is a system? A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time. The system may be buffeted, constricted, triggered, or driven by outside forces. But the system’s response to these forces is characteristic of itself, and that response is seldom simple in the real world ...
Systems can change, adapt, respond to events, seek goals, mend injuries, and attend to their own survival in lifelike ways, although they may contain or consist of nonliving things. Systems can be self-organizing, and often are self-repairing over at least some range of disruptions ...
Is there anything that is not a system? Yes—a conglomeration without any particular interconnections or function. Sand scattered on a road by happenstance is not, itself, a system. You can add sand or take away sand and you still have just sand on the road. Arbitrarily add or take away football players, or pieces of your digestive system, and you quickly no longer have the same system.
A key concept for defining a system: a system must consist of three kinds of things: elements, interconnections, and a function or purpose.
The system, to a large extent, causes its own behavior! An outside event may unleash that behavior, but the same outside event applied to a different system is likely to produce a different result.
Words and sentences must, by necessity, come only one at a time in linear, logical order. Systems happen all at once. They are connected not just in one direction, but in many directions simultaneously ... Pictures work for this language better than words.
A key consideration when trying to explain your system design to others.
"I don’t think the systems way of seeing is better than the reductionist way of thinking. I think it’s complementary, and therefore revealing."
The systems-thinking lens allows us to reclaim our intuition about whole systems and hone our abilities to understand parts, see interconnections, ask “what-if ” questions about possible future behaviors, and be creative and courageous about system redesign.
The behavior of a system cannot be known just by knowing the elements of which the system is made.
The lesson from the parable of the blind men and an elephant.
Many of the interconnections in systems operate through the flow of information (formal or informal).
For example, students in the university may use official course information and other informal information from peers/forums to decide which courses to take.
The best way to deduce the system’s purpose is to watch for a while to see how the system behaves ... If a government proclaims its interest in protecting the environment but allocates little money or effort toward that goal, environmental protection is not, in fact, the government’s purpose. Purposes are deduced from behavior, not from rhetoric or stated goals.
Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.
There can be systems within systems, purposes within purposes. For example, the purpose of a university is to discover and transfer knowledge. Different actors within the system may have different purposes: a student's purpose may be to get good grades, a professor's may be to get tenure, and so on.
Changing elements usually has the least effect on the system ... Changing interconnections in a system can change it dramatically. Changes in function or purpose also can be drastic.
For example, change the players in a school soccer team, and it is still the same school team. Change the rules of soccer to those of basketball, and even with the same players, you will get a new ball game. Similarly, keep the players and rules but change the purpose from winning to losing, and the change is drastic.
A stock is the foundation of any system. Stocks are the elements of the system that you can see, feel, count, or measure at any given time. A system stock is just what it sounds like: a store, a quantity, an accumulation of material or information that has built up over time ... A stock is the memory of the history of changing flows within the system.
If you understand the dynamics of stocks and flows - their behavior over time—you understand a good deal about the behavior of complex systems.
Systems thinkers use graphs of system behavior to understand trends over time, rather than focusing attention on individual events.
The human mind seems to focus more easily on stocks than on flows. When we do focus on flows, we tend to focus on inflows more easily than on outflows. A company can build up a larger workforce by more hiring, or it can do the same thing by reducing the rates of quitting and firing. These two strategies may have very different costs.
A stock takes time to change, because flows take time to flow. Stocks generally change slowly, even when the flows into or out of them change suddenly. Therefore, stocks act as delays or buffers or shock absorbers in systems. Industrialization cannot proceed faster than the rate at which factories and machines can be constructed and the rate at which human beings can be educated to run and maintain them.
Time lags provide stability and opportunities to experiment with different approaches.
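The stock-and-flow idea above can be sketched in a few lines of code (my own illustration, not from the book): a stock integrates its flows over time, so even a sudden change in a flow moves the stock only gradually.

```python
# A stock accumulates its net flow each time step.
# The numbers below are hypothetical, chosen to echo the workforce example.

def simulate_stock(stock, inflow, outflow, steps):
    """Update a stock each time step by its net flow (inflow - outflow)."""
    history = [stock]
    for _ in range(steps):
        stock = stock + inflow - outflow
        history.append(stock)
    return history

# Workforce of 100; hiring jumps to 12/month while quitting stays at 10/month.
# Despite the sudden flow change, the stock drifts up by only 2 per step:
# the stock acts as a buffer.
print(simulate_stock(100, inflow=12, outflow=10, steps=5))
# → [100, 102, 104, 106, 108, 110]
```

The same structure explains why industrialization cannot outrun the rate at which factories are built and people are trained: the stock can only move as fast as its flows allow.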
Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating the levels in the stocks by manipulating flows. That means system thinkers see the world as a collection of “feedback processes.”
For example, if the stock of food in your kitchen runs low, you make decisions and take actions to increase the stock by ordering online or buying from the supermarket. This is a kind of direct and balancing feedback loop that stabilizes the stock level. The stock level may not be fixed, but it stays within an acceptable range.
There are two types of feedback loops: balancing feedback loops and reinforcing feedback loops. As illustrated by the example above, balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change. Reinforcing feedback loops, in comparison, are self-enhancing, leading to exponential growth or to runaway collapses over time. The more soil is eroded from the land, the less plants are able to grow, so the fewer roots there are to hold the soil, so the more soil is eroded, so the fewer plants can grow.
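The two loop types can be contrasted in a small sketch (my own illustration with made-up parameters, not from the book): a balancing loop's flow is driven by the gap from a goal, while a reinforcing loop's flow is driven by the stock itself.

```python
def balancing(stock, goal, rate, steps):
    """Balancing loop: the flow is proportional to the gap from a goal,
    so the stock converges toward the goal (like restocking the kitchen)."""
    for _ in range(steps):
        stock += rate * (goal - stock)
    return stock

def reinforcing(stock, growth, steps):
    """Reinforcing loop: the flow is proportional to the stock itself,
    so the stock grows (or collapses) exponentially (like soil erosion)."""
    for _ in range(steps):
        stock += growth * stock
    return stock

print(round(balancing(20, goal=100, rate=0.5, steps=10), 1))  # → 99.9, approaching 100
print(round(reinforcing(100, growth=0.1, steps=10), 1))       # → 259.4, i.e. 100 * 1.1**10
```

Note how the balancing loop's output stays within range of its goal no matter where it starts, while the reinforcing loop compounds without limit.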
If A causes B, is it possible that B also causes A? ... A single stock is likely to have several reinforcing and balancing loops of differing strengths pulling it in several directions.
The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback.
Your mental model of the system needs to include all the important flows, or you will be surprised by the system’s behavior. If you’re gearing up your work force to a higher level, you have to hire fast enough to correct for those who quit while you are hiring.
Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior. When competing loops are of equal strength, neither dominates, and we have dynamic equilibrium.
For example, when the fertility rate is higher than the mortality rate in a population, the resulting behavior is exponential growth.
Dynamic systems studies are designed to explore what would happen if a number of driving factors unfold in a range of different ways. The question of whether the system really will react this way is a more scientific one: it is a question about how good the model is.
Nonrenewable resources are stock-limited. Renewable resources are flow-limited. Renewable resources can support extraction or harvest indefinitely, but only at a finite flow rate equal to their regeneration rate.
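The flow-limited nature of renewable resources can be sketched as follows (my own illustration with hypothetical numbers, not from the book): harvest below the regeneration rate and the stock is sustained; harvest above it and the stock drains.

```python
def harvest_years(stock, regen_rate, harvest, years):
    """Renewable resource stock: regenerates in proportion to itself,
    while a fixed harvest is extracted each year."""
    for _ in range(years):
        stock = max(0.0, stock + regen_rate * stock - harvest)
    return stock

# Regeneration is 5% per year on a stock of 1000 (about 50/year at this size).
print(harvest_years(1000, 0.05, harvest=40, years=20))  # stock grows: sustainable
print(harvest_years(1000, 0.05, harvest=80, years=20))  # stock shrinks toward collapse
```

A nonrenewable resource is the degenerate case with `regen_rate = 0`: every unit harvested is gone for good, so only the total stock limits extraction.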
Three characteristics of systems: resilience, self-organization, hierarchy
A rich structure of feedback loops gives a system its resilience. Meta-resilience arises from a set of feedback loops that can restore feedback loops. Meta-meta-resilience comes from feedback loops that can self-organize, learn, and evolve more complex restorative structures. A good example is the human body. However, there are always limits to resilience. Large organizations may lose their resilience if there are too many delays and distortions in their feedback mechanisms.
Loss of resilience can come as a surprise, because the system usually is paying much more attention to its play than to its playing space. One day it does something it has done a hundred times before and crashes. Systems need to be managed not only for productivity or stability, they also need to be managed for resilience.
Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new structures, whole new ways of doing things. It requires freedom and experimentation, and a certain amount of disorder.
Like resilience, self-organization is often sacrificed for purposes of short-term productivity and stability. Productivity and stability are the usual excuses for turning creative human beings into mechanical adjuncts to production processes. ... Or for establishing bureaucracies and theories of knowledge that treat people as if they were only numbers.
Complex forms of self-organization may arise from relatively simple organizing rules—or may not.
Example: the formation of the Koch snowflake.
The reductionist dissection of regular science teaches us a lot. However, one should not lose sight of the important relationships that bind each subsystem to the others and to the higher levels of the hierarchy, or one will be in for surprises.
The original purpose of a hierarchy is always to help its originating subsystems do their jobs better. This is something, unfortunately, that both the higher and the lower levels of a greatly articulated hierarchy easily can forget. Therefore, many systems are not meeting our goals because of malfunctioning hierarchies.
When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization.
We are able to create effective mental models to understand the world better, but always remember that our models cannot represent the world fully. Be wary of our assumptions and biases.
That’s one reason why systems of all kinds surprise us. We are too fascinated by the events they generate. We pay too little attention to their history. And we are insufficiently skilled at seeing in their history clues to the structures from which behavior and events flow.
For example, if we have found the equation for how temperature varies with heat flows (the behavior) but don't know how the thermostat works (the structure), we can still predict the future temperature, assuming nothing in the room changes. But this behavior-level analysis cannot help us if we want to reduce the bill or replicate the same behavior in another room.
The world is full of nonlinearities. So the world often surprises our linear-thinking minds. If we’ve learned that a small push produces a small response, we think that twice as big a push will produce twice as big a response. But in a nonlinear system, twice the push could produce one-sixth the response, or the response squared, or no response at all.
There is no single, legitimate boundary to draw around a system. We have to invent boundaries for clarity and sanity; and boundaries can produce problems when we forget that we’ve artificially created them ... Where to draw a boundary around a system depends on the purpose of the discussion—the questions we want to ask.
At any given time, the input that is most important to a system is the one that is most limiting ... One of the classic models taught to systems students at MIT is Jay Forrester’s corporate-growth model. It starts with a successful young company, growing rapidly. The problem for this company is to recognize and deal with its shifting limits—limits that change in response to the company’s own growth ... Insight comes not only from recognizing which factor is limiting, but from seeing that growth itself depletes or enhances limits and therefore changes what is limiting.
There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.
For example, if governments do not enforce limits to growth, the environment will choose and enforce its own.
Jay Forrester used to tell us, when we were modeling a construction or processing delay, to ask everyone in the system how long they thought the delay was, make our best guess, and then multiply by three.
Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system ... Fishermen overfish and destroy their own livelihood. Corporations collectively make investment decisions that cause business-cycle downturns. Poor people have more babies than they can support.
System traps, or archetypes, include policy resistance ("fixes that fail"), tragedy of the commons, drift to low performance ("eroding goals"/"boiled frog syndrome"), escalation, competitive exclusion ("success to the successful"), addiction ("shifting the burden to the intervenor"), rule beating, and seeking the wrong goal.
System traps can also serve as opportunities. For example, rule beating can serve as useful feedback for designing better rules in the future.
People often know where to find leverage points to create large shifts in behavior. However, they often push them in the wrong direction.
- Transcending Paradigms
- Information Flows
- Reinforcing Feedback Loops
- Balancing Feedback Loops
- Stock and Flow
“You’re acting as though there is a fine line at which the rent is fair, and at any point above that point the tenant is being screwed and at any point below that you are being screwed. In fact, there is a large gray area in which both you and the tenant are getting a good, or at least a fair, deal. Stop worrying and get on with your life.”
Contrary to economic opinion, the price of fish doesn’t provide that feedback. As the fish get more scarce they become more expensive, and it becomes all the more profitable to go out and catch the last few. That’s a perverse feedback, a reinforcing loop that leads to collapse. It is not price information but population information that is needed.
On the problem of overfishing: improving the flow of the right kind of information is important.
Even people within systems don’t often recognize what whole-system goal they are serving. “To make profits,” most corporations would say, but that’s just a rule, a necessary condition to stay in the game. What is the point of the game? To grow, to increase market share, to bring the world (customers, suppliers, regulators) more and more under the control of the corporation, so that its operations become ever more shielded from uncertainty.
The ancient Egyptians built pyramids because they believed in an afterlife. We build skyscrapers because we believe that space in downtown cities is enormously valuable ... people who have managed to intervene in systems at the level of paradigm have hit a leverage point that totally transforms systems ... All it takes is a click in the mind, a falling of scales from the eyes, a new way of seeing.
So how do you change paradigms? ... You keep pointing at the anomalies and failures in the old paradigm. You keep speaking and acting, loudly and with assurance, from the new one. You insert people with the new paradigm in places of public visibility and power. You don’t waste time with reactionaries; rather, you work with active change agents and with the vast middle ground of people who are open-minded.
Finally, one should not be fixated on any paradigm; as the author puts it, "there is no certainty in any worldview ... In the end, it seems that mastery has less to do with pushing leverage points than it does with strategically, profoundly, madly, letting go and dancing with the system."
In the final chapter, the author offers several pieces of advice on living in a world of systems.
- Get the beat of the system
- Expose our mental models and assumptions for discussion
- Avoid delaying or withholding information flows
- Explain clearly. Avoid language pollution
- Pay attention to what is important and not just what is easy to quantify.
- Make feedback policies for feedback systems
- Aim to enhance total systems properties
- Pay attention to existing value in the system before rushing to fix things
- Look for ways the system creates its own behavior
- Be a lifelong learner
- Celebrate Complexity
- Expand time horizons, thought horizons
- Defy the disciplines
- Expand the boundary of caring
- Don't erode the goal of goodness
Systems thinking for us was more than subtle, complicated mind play. It was going to make systems work.
It is one thing to understand how to fix a system; it is another to know how to implement the fix and how to get people together to do it.
We gave learned lectures on the structure of addiction and could not give up coffee. We knew all about the dynamics of eroding goals and eroded our own jogging programs. We warned against the traps of escalation and shifting the burden and then created them in our own marriages.