Hindsight Bias
After the 2008 financial crisis, everyone saw it coming. The signs were obvious: subprime mortgages, leverage ratios, credit default swaps. Except almost no one did see it coming — and those who did were mostly ignored. Once we know an outcome, we reconstruct the past as if the outcome were inevitable. This is hindsight bias: the illusion that we would have predicted what we only know in retrospect.
The mechanism rewrites memory. Our minds integrate the outcome into our representation of the past, making signals that pointed toward it seem stronger and signals that pointed away seem weaker. We don’t just believe we knew — we genuinely remember understanding more than we did.
Hindsight bias creates several problems. It makes us overconfident in our predictive abilities — we think we see patterns we can only see in the rearview mirror. It makes us unfairly judge past decisions — we evaluate choices by outcomes that weren’t foreseeable. It makes us bad at learning from experience — we can’t calibrate prediction quality if we misremember our predictions.
The bias is particularly vicious in complex domains. Markets, politics, technology — these generate surprises that, after they happen, look obvious. Each surprise confirms our sense that we understand the system. We never confront the limits of our foresight because we’ve edited those limits from memory.
Partial antidotes exist. Writing down predictions before outcomes. Keeping records of uncertainty. Forcing explicit probability estimates. These create external memory that resists post-hoc revision. But even with records, the feeling of obviousness persists. Knowing about hindsight bias doesn’t make the past feel unpredictable.
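The record-keeping antidote can be made concrete with a small prediction log: probabilities are written down before the outcome, outcomes are filled in afterward, and a Brier score (the mean squared error between stated probability and result, where lower is better) measures calibration. This is a minimal sketch, not a standard library or any established tool; the class and method names are illustrative.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Prediction:
    claim: str
    probability: float        # explicit estimate, recorded up front
    made_on: date
    outcome: Optional[bool] = None  # filled in only after the fact

class PredictionLog:
    """External memory that resists post-hoc revision."""

    def __init__(self) -> None:
        self._entries: list[Prediction] = []

    def record(self, claim: str, probability: float, made_on: date) -> Prediction:
        if not 0.0 <= probability <= 1.0:
            raise ValueError("probability must be in [0, 1]")
        entry = Prediction(claim, probability, made_on)
        self._entries.append(entry)
        return entry

    def resolve(self, prediction: Prediction, outcome: bool) -> None:
        prediction.outcome = outcome

    def brier_score(self) -> Optional[float]:
        """Mean of (probability - outcome)^2 over resolved predictions."""
        resolved = [p for p in self._entries if p.outcome is not None]
        if not resolved:
            return None
        return sum((p.probability - float(p.outcome)) ** 2
                   for p in resolved) / len(resolved)
```

Recording the probability at prediction time is the whole point: once the number is on disk, "I knew it all along" has to argue with what you actually wrote.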
The practical lesson: when evaluating decisions, ask what was knowable at the time, not what is known now. And when claiming you “saw it coming,” check your records. The past is less predictable than it looks from the future.
Related: epistemology, narrative fallacy, survivorship bias, signal and noise, regression to the mean