Feedback Loops
A feedback loop exists when a system’s output circles back to influence its input. The thermostat measures temperature (output of the heating system), then adjusts heating (input to the system). The loop closes.
This simple structure — output becomes input — generates most of the dynamics that matter: stability, oscillation, runaway growth, collapse. Understanding feedback is understanding how systems behave over time.
Negative feedback opposes change. The system detects deviation from a target and pushes back. Thermostat detects cold, activates heat, temperature rises, thermostat detects warmth, deactivates heat. The loop stabilizes around a setpoint.
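The loop fits in a few lines of code. A minimal sketch of a thermostat, with illustrative numbers rather than a physical model:

```python
# Negative feedback: detect deviation from a setpoint and push back.
# All constants are illustrative, not a physical model.

setpoint = 20.0      # target temperature
temperature = 10.0   # starting temperature
heat_gain = 1.5      # degrees added per step when the heater is on
heat_loss = 0.5      # degrees lost to the environment every step

for step in range(20):
    heater_on = temperature < setpoint   # detect deviation from the target
    if heater_on:
        temperature += heat_gain         # push back toward the setpoint
    temperature -= heat_loss             # constant drift away from it
    print(f"step {step:2d}: {temperature:5.1f}  heater {'on' if heater_on else 'off'}")
```

The temperature climbs, then hovers in a narrow band around the setpoint: deviation triggers correction, correction removes the deviation.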
Biology runs on negative feedback. Blood sugar rises; pancreas releases insulin; cells absorb glucose; blood sugar falls. Body temperature rises; blood vessels dilate; sweating begins; temperature falls. The technical term is homeostasis — maintaining stability through continuous correction.
Markets use negative feedback. Price rises signal scarcity; producers increase supply; supply growth pushes price down. Price falls signal surplus; producers cut back; scarcity returns. The price mechanism is a thermostat for allocation.
Negative feedback creates stability but resists change. Systems with strong negative feedback return to equilibrium after perturbation. Push them, they push back.
Positive feedback amplifies change. The system detects a signal and strengthens it. Small deviations grow into large ones. The loop runs away.
Compound interest is positive feedback. Earnings generate more capital; more capital generates more earnings. The rich get richer because wealth feeds itself.
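The structure is visible in a two-line loop. The 5% rate and 30-year horizon are assumed for illustration:

```python
# Positive feedback: each period's earnings are added to the capital
# that generates the next period's earnings. The 5% rate is illustrative.

capital = 1_000.0
rate = 0.05

for year in range(30):
    capital += capital * rate   # output (earnings) feeds back into input (capital)

print(f"after 30 years: {capital:,.0f}")   # ~4,322 from an initial 1,000
```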
Bank runs are positive feedback. One depositor withdraws, signaling danger; others withdraw; the signal strengthens; everyone withdraws. The fear creates what it fears.
Network effects are positive feedback. More users make a platform more valuable; value attracts more users; the platform grows until it dominates. Facebook, Uber, and Airbnb won through positive feedback loops that competitors couldn’t ignite.
Arms races, viral spread, speculative bubbles — all positive feedback. Small initial advantages compound until they dominate. Winners take all because winning feeds winning.
Most real systems combine both. Positive feedback drives growth; negative feedback constrains it. The interplay creates complex dynamics.
Population growth is positive feedback (more people → more births). Resource limits are negative feedback (more people → less food per person → more deaths). The interaction produces S-curves: exponential growth, then slowdown, then equilibrium — or overshoot and collapse if growth outruns the negative feedback’s capacity to correct.
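That interaction is the logistic growth equation. A sketch with an assumed growth rate and carrying capacity:

```python
# Logistic growth: positive feedback (births proportional to population)
# checked by negative feedback (crowding as population nears capacity K).
# Growth rate r and capacity K are illustrative.

r, K = 0.5, 1_000.0
population = 10.0

for step in range(30):
    # dP = r * P * (1 - P/K): exponential at first, flattening as P approaches K
    population += r * population * (1 - population / K)
    if step % 5 == 4:
        print(f"step {step + 1:2d}: {population:7.1f}")
```

In this discrete sketch, raising r well above 2 makes each step overshoot K, and the population oscillates instead of settling: the overshoot case described above.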
Markets combine both. Momentum trading is positive feedback: rising prices attract buyers, pushing prices higher. Value investing is negative feedback: high prices attract sellers, pushing prices down. When positive feedback dominates, you get bubbles. When negative feedback dominates, you get mean reversion. Which dominates depends on timescale, participants, and psychology.
Delay makes feedback dangerous. If the response comes too late, the system overshoots. The thermostat that takes an hour to register temperature change will freeze the house, then overheat it. The oscillation comes from acting on stale information.
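The thermostat sketch from earlier shows this when the controller acts on a stale reading. The lag length is illustrative:

```python
# Same negative-feedback loop, but the controller sees a measurement
# taken several steps ago. The lag and gains are illustrative.

from collections import deque

setpoint, temperature = 20.0, 10.0
heat_gain, heat_loss = 1.5, 0.5
lag = 6                                        # steps between measurement and action
readings = deque([temperature] * lag, maxlen=lag)

for step in range(40):
    stale_reading = readings[0]                # what the controller sees: old data
    if stale_reading < setpoint:
        temperature += heat_gain               # heating based on outdated information
    temperature -= heat_loss
    readings.append(temperature)               # today's reading arrives `lag` steps later
    print(f"step {step:2d}: {temperature:5.1f}")
```

With the lag set to zero the loop settles into a narrow band around the setpoint; with a lag of several steps it swings well past the target in both directions, correcting for conditions that no longer exist.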
Economic policy is plagued by delay. The Fed raises rates; the effect takes 18 months to appear; by then conditions have changed. Policy responds to yesterday’s data with actions that won’t land until tomorrow.
Ecological systems have long delays. Carbon emitted today warms the climate for centuries. Fisheries overfished today take decades to recover — if they recover at all. The delay between cause and effect obscures responsibility and frustrates intervention.
Feedback blindness is the failure to see loops. Linear thinking assumes causes produce effects, full stop. Feedback thinking asks: and then what? How does the effect change the cause?
Resistance to antibiotics is feedback. Kill susceptible bacteria; resistant strains survive and multiply; resistance spreads; antibiotics stop working. The intervention changes the system it’s intervening in.
Goodhart’s Law is feedback blindness. The metric was a proxy for what you cared about. When you optimized the metric, you changed the system. The proxy stopped tracking reality. You optimized the output without noticing it was feeding back to corrupt the input.
Traffic planning shows feedback blindness. Build more roads to reduce congestion; reduced congestion attracts more drivers; congestion returns. This is induced demand: the intervention regenerates the problem it set out to solve.
The question for any intervention: what loops does this create? What happens when the effect feeds back into the cause? Systems don’t hold still while you act on them. They respond. The response is the feedback.
Stability comes from negative feedback that’s fast enough and strong enough. Growth comes from positive feedback that isn’t checked. Collapse comes from positive feedback that outruns negative feedback. Oscillation comes from delay.
To change a system, find the feedback loops.
Go Deeper
Books
- Thinking in Systems: A Primer by Donella Meadows — The essential introduction. Clear, practical, readable. Start here.
- The Fifth Discipline by Peter Senge — Systems thinking applied to organizations. Includes causal loop diagrams.
- Cybernetics by Norbert Wiener (1948) — The original. Dense but foundational.
- An Introduction to Cybernetics by W. Ross Ashby — More accessible than Wiener, with the Law of Requisite Variety.
Essays
- “Leverage Points: Places to Intervene in a System” by Donella Meadows — Her famous essay on where feedback interventions have most impact.
- Jay Forrester’s system dynamics work at MIT shaped the field.
Related: [[systems]], [[goodharts-law]], [[hysteresis]], [[jevons-paradox]], [[antifragility]]
In this section
- Homeostasis: How living systems maintain stability through continuous correction
- Runaway Processes: When positive feedback escapes control