Entropy

Created Dec 23, 2024 · physics · systems · thermodynamics

The Second Law of Thermodynamics: in any isolated system, entropy never decreases. Disorder accumulates. Energy disperses. Heat flows from hot to cold, never the reverse. This is statistical rather than absolute, but for any macroscopic system the odds are so overwhelmingly lopsided that violations are never observed.
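A toy model makes that statistical certainty concrete. Here is a minimal sketch (Python, the Ehrenfest urn model; N, STEPS, and all names are illustrative): every particle starts on one side of a box, each step one random particle hops to the other side, and the Boltzmann entropy S = ln W (with k_B = 1) climbs toward its maximum and stays there.

```python
# Ehrenfest urn model: N particles in a box with two halves. Each step,
# one particle chosen uniformly at random switches sides. The macrostate
# is "n_left particles on the left"; its multiplicity is W = C(N, n_left),
# so the Boltzmann entropy (k_B = 1) is S = ln C(N, n_left).
import math
import random

N = 1000        # number of particles
STEPS = 5000    # hops to simulate
n_left = N      # start far from equilibrium: all particles on the left

def entropy(n_left: int) -> float:
    """ln C(N, n_left), computed via log-gamma to avoid huge factorials."""
    return (math.lgamma(N + 1)
            - math.lgamma(n_left + 1)
            - math.lgamma(N - n_left + 1))

for step in range(STEPS + 1):
    if step % 500 == 0:
        print(f"step {step:5d}  left={n_left:4d}  S={entropy(n_left):8.2f}")
    # A particle on the left hops right with probability n_left / N;
    # otherwise a particle on the right hops left.
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
```

Nothing forbids entropy from dropping; run it with N = 10 and you will see occasional dips. At N on the order of 10^23, those dips become so improbable that "never decreases" is, in practice, exact.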

Entropy explains why time has a direction. You can scramble an egg but not unscramble it. A cup can shatter but shards don’t reassemble. The past differs from the future because entropy was lower then. The arrow of time is the arrow of increasing disorder.


Living systems seem to violate the Second Law — they build order from chaos, complexity from simplicity. But they don’t. They’re not closed systems. They import low-entropy energy (sunlight, food) and export high-entropy waste (heat, excrement). Local order is purchased by increasing disorder elsewhere. The universe as a whole still moves toward equilibrium.
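The bookkeeping behind "purchased elsewhere" fits in one inequality (a sketch; the subscripts are illustrative):

```latex
% Entropy balance for an open system and its surroundings.
% A local decrease is allowed as long as the environment pays for it:
\Delta S_{\text{universe}}
  = \Delta S_{\text{organism}} + \Delta S_{\text{environment}} \ge 0
% so \Delta S_{\text{organism}} < 0 is permitted whenever
% \Delta S_{\text{environment}} \ge -\Delta S_{\text{organism}}.
```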

This has profound implications for civilization. Every economy, every city, every organism is a dissipative structure — it maintains itself only by consuming ordered energy and producing waste. There is no perpetual motion, no free lunch, no sustainable growth without continuous input. antifragility can’t escape thermodynamics.


The concept extends beyond physics. Information entropy (Shannon entropy) measures uncertainty: the average number of bits you need to specify a message drawn from a source. A random string has high entropy; a predictable pattern has low entropy. compression works by exploiting low entropy; you can't compress randomness.
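A minimal sketch of the calculation (Python; the example strings are just illustrative inputs): Shannon entropy over symbol frequencies, H = -Σ pᵢ log₂ pᵢ, which lower-bounds the bits per symbol any lossless compressor can achieve.

```python
# Shannon entropy of a string, estimated from empirical symbol
# frequencies: H = -sum(p_i * log2(p_i)) bits per symbol.
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Bits of uncertainty per symbol, from the string's own frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))   # 0.0  -- perfectly predictable
print(shannon_entropy("abababab"))   # 1.0  -- one bit per symbol
print(shannon_entropy("q7#kZp!x"))   # 3.0  -- eight distinct symbols, maximal
```

The last line is the note's point in miniature: when every symbol is equally surprising, there is no pattern left for a compressor to exploit.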

Social systems have entropy too. Organizations accumulate cruft, procedures, exceptions. Relationships require maintenance or they decay. Gardens return to weeds. The default state of everything is disorder. Order requires work, continuously applied.

Related: systems, maintenance, feedback loops, antifragility, deep time