Survivorship Bias
During World War II, the U.S. military studied bullet holes in returning aircraft. The obvious conclusion: add armor where the holes are. Abraham Wald, a statistician, pointed out the problem. They were only seeing planes that survived. The holes showed where planes could take damage and still return. The missing holes marked the fatal zones.
They should armor where returning planes weren’t hit.
Once you see this pattern, you can’t unsee it.
Successful entrepreneurs tell you to drop out of college and follow your dreams. But you never hear from the thousands who dropped out and failed. Their advice would be different.
Published authors say persistence is everything. But authors who persisted just as long and never got published aren’t giving interviews.
Mutual funds advertise their best performers. But funds that underperformed get quietly closed, and their track records vanish from the marketing materials.
The evidence that reaches you has already been filtered. You’re not seeing reality. You’re seeing what made it through.
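The filter is easy to reproduce. Below is a minimal simulation sketch in plain Python; every parameter is hypothetical (1,000 funds, ten years, yearly returns that are pure noise with zero mean, and an assumed closure rule that quietly shuts any fund down more than 20%). It measures the average track record two ways: across every fund launched, and across only the funds that survived to advertise.

```python
import random
import statistics

random.seed(42)

# Hypothetical parameters for illustration only, not real market data.
N_FUNDS = 1000       # funds launched
N_YEARS = 10         # years of track record
TRUE_MEAN = 0.0      # returns are pure noise: no fund has any skill
TRUE_STDEV = 0.15    # annual volatility
CLOSE_BELOW = -0.20  # assumed rule: a fund is quietly closed once its
                     # cumulative return dips below -20%

# Every fund draws its yearly returns from the same distribution.
funds = [[random.gauss(TRUE_MEAN, TRUE_STDEV) for _ in range(N_YEARS)]
         for _ in range(N_FUNDS)]

def cumulative(returns):
    """Compound a list of yearly returns into one total return."""
    total = 1.0
    for r in returns:
        total *= 1 + r
    return total - 1

def survives(returns):
    """A fund survives only if it never crosses the closure threshold."""
    total = 1.0
    for r in returns:
        total *= 1 + r
        if total - 1 < CLOSE_BELOW:
            return False  # closed; its track record vanishes
    return True

survivors = [f for f in funds if survives(f)]

avg_all = statistics.mean(cumulative(f) for f in funds)
avg_survivors = statistics.mean(cumulative(f) for f in survivors)

print(f"funds launched:              {N_FUNDS}")
print(f"funds still marketing:       {len(survivors)}")
print(f"avg 10-yr return, all:       {avg_all:+.1%}")
print(f"avg 10-yr return, survivors: {avg_survivors:+.1%}")
```

With zero true skill anywhere, the full population averages out to roughly nothing, while the survivors' average comes out comfortably positive. The closure rule alone manufactures the apparent performance. That is the whole mechanism: nobody lied, nobody cherry-picked a single fund; the filter did it.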
The hard part is that the missing data often doesn't exist. Failed startups rarely publish postmortems. Unpublished writers leave no records. The graveyard is dark.
When someone offers advice backed by success stories, I try to ask: how large is the graveyard? How many tried the same thing and failed? Usually nobody knows. Sometimes nobody can know.
But even recognizing the bias helps. Success stories feel like proof. They’re evidence of what’s possible, not evidence of what’s probable.
Related: epistemology, fat tails, signal and noise, selection, risk