Ruminating On The Rhythms of Reality: Systems Thinking
Unlocking intelligent, strategic problem-solving: A curious journey through systems thinking
In this edition:
What I Think You Need To Know: What does it mean to think in systems? Why does it matter?
This Should Make You Smarter: What’s systems thinking got to do with strategic thinking?
Through The Systems Thinker’s Lens: How do you start practicing the art and science?
I dedicate this edition to the memory of a towering scientist, educator, and systems thinker — Donella Meadows — the key takeaways of whose book, “Thinking in Systems,” I expound in what follows.
* * *
Systems are ubiquitous. We live in a gigantic, complex system. Our body is a system. The basket of apples and bananas, seemingly lifeless, is a live system of invisible chemical reactions. When one encounters the word ‘system’, I suspect one’s mind immediately invokes a sense of the technical. I suppose that’s due to the technological hype that has, in a manner of speaking, unwittingly usurped the concept and narrowed it to computer systems.
Yet, we have natural and social systems that subsume technical systems at far grander scales. While technical systems are arguably easier to understand (since humans build them with an overall purpose), natural and social systems are difficult to unweave (since they conceal their true purpose). It is ironic, then, how cavalier humans are about interfering with such systems given such limited understanding of them. To see why, we’ll digress into the fascinating world of systems thinking, where we get some concepts straight.
What I Think You Need To Know
A simple definition will help us park any nuances as we navigate through the wondrous world of systems thinking. A system is merely a set of interconnected elements that, as a whole, have a specific purpose. Thus, a system has a structure: we see its elements and the links between them. But it also has behavior: the multiplicity of connections between elements gives rise to patterns of behavior over time, some of which may be unleashed by outside factors.
“The central insight of Systems Theory,” writes Meadows in Thinking in Systems, “is that once we see the relationship between structure and behavior, we can begin to understand how systems work, what makes them produce poor results, and how to shift them into better behavior patterns.”
Another important insight is to observe the system for a period of time to feel its beat. This helps us learn the system’s purpose, given that “purpose is deduced from behavior, not from rhetoric or stated goals.” Such observations are important because while it is easy to learn about a system’s elements (e.g., the employees of an organization), it is much harder to learn about their interconnections (e.g., the many links between employees that give rise to all sorts of patterns of behavior in the organization).
Because elements are the easiest to identify and learn about, manipulating them with the intent of changing the system is often (but not always) fruitless. The key to intervening in a system to alter its behavior is to manipulate the links between elements — that is what contributes most significantly to a change in the system’s purpose.
To introduce the key concepts and models from Systems Theory, the author spends a few pages digressing into the technical (dynamical) aspects of systems. Her objective is to illustrate, via system dynamics, the behavior of complex systems over time. As any systems thinker will attest, stocks and flows constitute a powerful tool for modeling systems:
stocks are system elements that increase or decrease in quantity over time due to the impact of inflows (increases) and outflows (decreases)
flows are rates of change that impact stocks over time
The best metaphor is that of the bathtub, which is a stock governed by two flows: the faucet (which increases the water level in the tub) and the drain (which decreases it).
In reality, “stocks take time to change because flows are slow; thus stocks act as delays or shock absorbers,” writes Meadows. This observation is critical when studying a system because it reminds the observer that patience pays off. “If you understand the dynamics of stocks and flows over time, you understand a great deal about the behavior of complex systems.”
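To make the stock-and-flow idea concrete, here is a minimal Python sketch of the bathtub model. It is my own illustration rather than anything from the book, and the litres-per-minute numbers are invented: one stock whose level changes only through the faucet’s inflow and the drain’s outflow.

```python
# A minimal stock-and-flow sketch (illustrative only, not from the book):
# the bathtub is a stock; the faucet is an inflow, the drain an outflow.

def simulate_bathtub(initial_level, faucet_rate, drain_rate, minutes):
    """Return the water level (litres) at each minute."""
    level = initial_level
    history = [level]
    for _ in range(minutes):
        # The stock changes only via its flows: inflow minus outflow.
        level = max(0.0, level + faucet_rate - drain_rate)
        history.append(level)
    return history

if __name__ == "__main__":
    # Faucet adds 5 L/min, drain removes 3 L/min: the stock rises slowly,
    # which is why stocks act as delays or shock absorbers.
    for minute, litres in enumerate(simulate_bathtub(10.0, 5.0, 3.0, 10)):
        print(f"minute {minute:2d}: {litres:5.1f} L")
```

Even this toy shows the point of the quote above: the stock cannot jump; it can only drift at the pace its flows allow.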
Now comes the fun part — feedback loops, or the building blocks of dynamic system behavior. Feedback loops describe the cycles of cause and effect that govern the system: a change in a stock may have some sort of ripple effect throughout the system that eventually comes back to affect the flows into or out of that stock. The defining feature of a feedback loop is that the chain of cause and effect closes into a circle.
As an example, take this analysis I conducted of how cultural performance and strategy execution success rate dance together in a reinforcing feedback loop: the higher the cultural performance, the better the team performance, the higher the maturity of related capabilities, thus the higher the chance of successful strategy execution, which increases leadership trust, which then further increases cultural performance, and so forth in cycles. This is an example of a reinforcing loop, because of the compounding change in the same direction.
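If you prefer to see the compounding in numbers, here is a toy Python sketch of that reinforcing loop. It is my own illustration; the gain and the “cultural performance index” are invented for the purpose.

```python
# A toy sketch of a reinforcing loop (illustrative numbers only):
# higher cultural performance -> higher execution success -> more trust,
# which feeds back to lift cultural performance in the next cycle.

def reinforcing_loop(culture=1.0, gain=0.1, cycles=10):
    """Each cycle, the trust earned from execution compounds cultural performance."""
    levels = [culture]
    for _ in range(cycles):
        execution_success = culture * gain  # success is driven by culture
        culture += execution_success        # trust feeds the success back in
        levels.append(culture)
    return levels

if __name__ == "__main__":
    for cycle, level in enumerate(reinforcing_loop()):
        print(f"cycle {cycle:2d}: cultural performance index = {level:.2f}")
```

The output grows faster each cycle, which is exactly the compounding change in the same direction that defines a reinforcing loop.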
On the other hand, a balancing loop produces stock changes in opposing directions. For instance, the lower the leadership trust in teams, the higher the coaching invested (hopefully), the more motivated teams become, the better their business outcomes, the higher the leadership trust, the lower the coaching required, and so forth in a balancing manner, with obviously more convoluted delays in said effects. This is why balancing loops are considered system stabilizers.
And the concept of delays is very important in systems thinking. Arguably every connection between elements has some sort of delay. This must be taken into consideration to get the narrative — or, mental model — about the system as accurate as possible.
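The sketch below, again my own illustration with arbitrary parameters, combines the two ideas: a balancing loop pushes a stock toward a target, but the corrective action reacts to a reading that is a few steps old. The delay is what produces the overshoot and oscillation before the stock settles.

```python
from collections import deque

# A balancing loop with a delay (illustrative only): the controller pushes a
# stock toward a target, but it reacts to a reading that is `delay` steps old.

def balancing_loop_with_delay(target=100.0, start=20.0, gain=0.3,
                              delay=3, steps=30):
    stock = start
    readings = deque([start] * delay, maxlen=delay)  # delayed perception of the stock
    history = [stock]
    for _ in range(steps):
        perceived = readings[0]                # what the actor believes the level is
        stock += gain * (target - perceived)   # corrective (balancing) flow
        readings.append(stock)                 # today's level is only seen `delay` steps later
        history.append(stock)
    return history

if __name__ == "__main__":
    for step, level in enumerate(balancing_loop_with_delay()):
        print(f"step {step:2d}: {level:6.1f}")
```

Shorten the delay (or soften the gain) and the oscillation fades; lengthen it and the swings grow, which is why delays deserve a place in every mental model of a system.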
System Traps and Archetypes
In studying systems and constructing narratives (mental models) to describe what we observe, there are some commonly-encountered patterns, referred to as system archetypes. These archetypes can help us avoid any associated traps by intervening wisely.
Meadows identifies a few archetypes and accompanies each with an intervention strategy to avoid the associated traps:
Policy Resistance. We’ve all been there. You’ve got this great policy, but all of a sudden resistance points pop up. This happens when policies acting as balancing feedback loops, desirable as they may be for their stability, produce undesirable behaviors over long periods of time. Elements resist due to a lack of incentives, thus falling prey to the “fixes that fail” trap. To mitigate this trap, find a way to align goals and incentives by communicating vision.
Tragedy of the Commons. Self-interest in exploiting some limited resource can lead to collective disadvantage. “I’ll wipe out the Amazon to produce more paper,” said Joe, who obviously is one of the many average humans reacting to events. A great way to avoid such tragedies is education — an effective feedback loop that, when introduced into the system, can produce desired behaviors. Regulation is another such loop.
Drift to Low Performance. How do you keep effort up, especially when you’re dealing with perceptions that are far removed from the desired system states introduced by policies? One way to prevent drift is to make standards mandatory and to make goals sensitive to the best past performances (run those retrospectives wisely!).
Escalation. “I’ll compete with Jane,” says Joe. “I’ll compete with Joe,” says Jane. And so forth in a reinforcing loop, thus escalating the “I’ll get ahead of them” game. Obviously the smart way to avoid the shrinkage of a stock is to negotiate an equilibrium.
Competitive Exclusion. Also known as “success to the successful,” this trap allows the winner of a larger share to further compound the resulting inequality (a toy sketch of this trap follows this list). The trick is to diversify the system with feedback loops that reinforce fair play.
Addiction. “Just one more cup of coffee and I’ll stop for the day.” “Just one more night of spaghetti and I’ll go on a diet.” “Another smoke and I’ll quit.” Shifting the burden, as this trap is also known, happens when the level of a stock (well-being, self-worth, cardiac health) depends on a factor (“my need to smoke or drink coffee, my high blood pressure, etc.”). Thus, the actor relieves their own symptoms by succumbing to the factor (a balancing loop) instead of fixing the root cause, so the symptom is bound to return!
Rule Beating. You must have run into someone who pays lip service to rules but then degrades the feedback loops those rules bring about by disobeying them, thus distorting the system. The trap can be mitigated by designing rules that promote creativity.
Seeking the Wrong Goal. Misalignment with stated goals can create system behavior that diverges from the system’s welfare. Because goals act as indicators of rule satisfaction, when they are defined inaccurately the system will simply produce unintended behavior — it’s sensitive to goals. The key to avoiding this trap is to specify goals that reflect the desired behaviors.
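As a small illustration of how quickly one of these traps compounds, here is a toy Python sketch of the “success to the successful” archetype. It is my own construction with made-up numbers: the prize in each round is allocated super-linearly in favour of the current leader, so a 55/45 split drifts toward total dominance within a handful of rounds.

```python
# A toy sketch of the "success to the successful" trap (illustrative only):
# each round the prize is allocated super-linearly in favour of the current
# leader, so a small initial edge compounds into dominance.

def success_to_the_successful(share_a=0.55, share_b=0.45, rounds=10):
    """Return (share_a, share_b) per round; the shares always sum to 1."""
    history = [(share_a, share_b)]
    for _ in range(rounds):
        # The existing share buys disproportionate access to the next round's
        # rewards (a reinforcing loop favouring whoever is already ahead).
        weight_a, weight_b = share_a ** 2, share_b ** 2
        share_a = weight_a / (weight_a + weight_b)
        share_b = weight_b / (weight_a + weight_b)
        history.append((share_a, share_b))
    return history

if __name__ == "__main__":
    for rnd, (a, b) in enumerate(success_to_the_successful()):
        print(f"round {rnd:2d}: A = {a:.2f}, B = {b:.2f}")
```

Interventions that diversify the system act on the exponent, as it were: they weaken the reinforcing loop so that an early lead no longer guarantees the whole pie.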
This Should Make You Smarter
Here are some important lessons to bookmark from systems thinking:
Systems work well because of three characteristics: resilience, self-organization, and hierarchy.
Systems exhibit dynamic behavior. Our purpose is not to predict the future, but to analyze what-if scenarios in order to see how the system responds.
Don’t be myopic by focusing on, and reacting to, short-term events. “Look for long-term behavior and structure, limiting factors and system boundaries, and nonlinearities and delays.”
Events are the output of the system — the key is to come to terms with the system that gives birth to them.
That’s why the systems thinker dismisses events in favor of a deeper look into the history of the corresponding system to get clues from the behavioral structures of that system over time.
Delays between elements are ubiquitous — consider them in your analysis and synthesis because they serve as “strong determinants of behavior.” Without understanding the characteristics of delays, “we can’t begin to understand the dynamic behavior of systems.” The longer a delay, the more foresight one ought to have.
There are limits to growth. A reinforcing loop must eventually be checked by some balancing loop. “Any physical, growing system eventually runs into a constraint. … Whenever we see a growing system, we look for the reinforcing loops that are driving it and for the balancing loops that will ultimately constrain it.” (A small numeric sketch of this point follows the list.)
Here’s a trick: When foraying into a system analysis, identify structures that produce behaviors and the conditions under which they do so. This gives you the power to alter those structures so as “to reduce the probability of destructive behaviors and to encourage the possibility of beneficial ones.”
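To see the limits-to-growth lesson numerically, here is a minimal Python sketch; it is my own illustration, and the growth rate and carrying capacity are arbitrary. A reinforcing loop drives growth until a balancing loop tied to a finite capacity constrains it, producing the familiar S-shaped curve.

```python
# A minimal "limits to growth" sketch (illustrative only): a reinforcing loop
# drives growth, while a balancing loop tied to a finite carrying capacity
# eventually constrains it (the classic logistic curve).

def limits_to_growth(stock=1.0, growth_rate=0.5, capacity=100.0, steps=25):
    history = [stock]
    for _ in range(steps):
        reinforcing = growth_rate * stock         # the growth engine
        balancing = 1.0 - stock / capacity        # the constraint kicking in
        stock += reinforcing * balancing
        history.append(stock)
    return history

if __name__ == "__main__":
    for step, value in enumerate(limits_to_growth()):
        print(f"step {step:2d}: {value:6.1f}")
```

Early on the balancing term is close to 1 and growth looks purely exponential; as the stock approaches the capacity, the balancing loop dominates and growth flattens out.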
The following exhibit neatly summarizes the leverage points in the traditional iceberg model, reflecting the view that systems thinking digs deeper into the depths and hidden structures of the system to come up with smart intervention strategies:
Given the model above, Meadows presents a set of leverage points — “places in the system where a small change could lead to a large shift in behavior.” I enumerate them below from the least to the most effective:
Numbers. This is like reacting to events by setting limits, allocating resources, etc. This is really the lowest-leverage way to intervene to change behavior because, even though a drop of water generates some waves on a placid lake, the lake returns to that placid state shortly thereafter.
Stock and flow structures. This is when “strategists,” for instance, intervene by adjusting system elements and rates of change. “I’ll let Bob go because he’s the head of the sales department that’s not meeting targets.” Obviously Bob will go for optics because the system couldn’t care less — the deeper structures matter.
Delays. Oscillations in the system are caused by delays, which affect stock levels. Intervening in factors that delay flow to increase or decrease delays may provide some better leverage to change behavior.
Feedback Loops. Now we’re swimming deeper into the ocean. With reinforcing loops, we observe growth. Here, we may introduce sufficient delay so that the balancing loops in the system are given a chance to control growth behavior. This is hard, but smarter (read: more strategic) than the above.
Rules and Information Flows. Whoever produces rules, rules the system, as it were. “Power over rules is real power, as rules define system scope, boundaries, and freedom,” writes Meadows. As we have seen above, rules have the power to distort the system, but also to maintain desired behaviors. Furthermore, “missing information flows is the most common cause of system malfunction.” Thus, restoring information seems like a powerful intervention — and it is.
Self-Organization. Capitalizing on this system characteristic can give us the ability to set up clever rules in a way that enables the system to add to or subtract from itself. This encourages variability and diversity — much-desired properties in systems.
System Goals. Setting additional goals besides the obvious is another high-leverage, strategic way to introduce desired behaviors into the system.
Paradigm Shifts. The fundamental restructuring of the system and the transcending of paradigms to allow for flexibility without constraints is the most strategic way to handle systems.
Through The Systems Thinker’s Lens
Systems thinking is, in fact, vital to strategic management. In particular, systems thinking boosts strategic thinking by enhancing panoramic or big-picture thinking capabilities. In addition to analytical thinking, which is how our brains are wired to think from grade school until we retire, systems thinking invites us to fly thousands of feet up to study the system to the best of our abilities.
Donella Meadows summarizes this aptly as the key point of systems thinking:
Paradigms are the source of systems … from shared social agreements and ideas about the nature of reality come system goals and information flows, and system structures and elements. The way to change paradigms is by building models of systems, which take us outside the system and force us to see it whole. The highest leverage point is transcending paradigms, remaining flexible, and not subscribing to any one paradigm.
Thus, we work on, not in, the system when developing the most effective strategies. But because we come to terms with the complexity and nonlinearity of systems, we also let strategies emerge — can’t really dodge that reality. Here’s an example.
Happy Strategizing!