
McNamara Fallacy

Created Dec 23, 2024 · epistemology · systems · decisions

Robert McNamara ran the Vietnam War by numbers: body counts, sortie rates, villages pacified. The metrics looked good; the war was being lost. The things that mattered — political legitimacy, popular support, guerrilla adaptability — weren’t being measured. The things being measured weren’t the things that mattered.

The fallacy has four steps: measure what can be easily measured; disregard what can't; presume what can't be measured isn't important; presume what isn't measured doesn't exist. Each step is a small error; together they construct a false reality.


The fallacy thrives wherever quantification meets complexity. Schools measure test scores, not love of learning. Hospitals measure throughput, not healing. Companies measure hours worked, not value created. The unmeasured qualities are real; they’re just invisible to the dashboard.

This is goodharts law upstream: before a metric can become the target, someone has to choose what to measure, and that choice already distorts the system. McNamara didn't corrupt good metrics; he chose metrics that couldn't capture what he needed to see.


The antidote: remember what measurement can’t do. Quantification illuminates; it also casts shadows. The discipline is to stay curious about what lives in the shadows: the employee morale that doesn’t show up until turnover spikes, the product quality that doesn’t appear until returns flood in, the strategic position that looks fine until it suddenly collapses.

Some things that matter can be measured eventually. Others can be sensed but not counted. Wisdom is knowing which is which, and not pretending the second category is empty.

Related: goodharts law, legibility, map and territory, signal and noise, tacit knowledge