Occam's Razor
mental-model
Source: Tool Use
Categories: philosophy, cognitive-science
From: Poor Charlie's Almanack
Transfers
A cutting tool mapped onto explanation selection. Among competing hypotheses that equally account for the evidence, prefer the one with the fewest assumptions. The razor metaphor is precise: it does not build or create; it removes. The instrument’s job is to shave away unnecessary complexity, leaving only what is required to explain the observed facts.
The mapping structures how we evaluate explanations:
- Simplicity is a selection criterion, not a truth criterion — the razor does not claim simple explanations are always correct. It claims that unnecessary complexity is a cost, and costs need justification. Every additional assumption in a theory is another place where the theory can be wrong. The razor cuts the assumptions that do not pay their way by explaining additional evidence.
- Entities should not be multiplied beyond necessity — William of Ockham’s original formulation. If one cause explains the data, do not posit two. If a natural process explains the phenomenon, do not invoke a supernatural one. The standard is necessity: you can add complexity, but only when the simpler explanation fails.
- The razor is comparative, not absolute — it operates on pairs of explanations, preferring the simpler of two that explain the same data equally well. It does not say “simple is good” in the abstract. A simple explanation that fails to account for the evidence is not preferred; it is wrong. The razor only applies when explanatory power is held constant.
- Parsimony maps onto probability — there is a formal justification for Occam’s razor in Bayesian statistics and information theory. Simpler models have higher prior probability because they make fewer specific claims about the world. Solomonoff induction formalizes this: shorter descriptions of data are more probable a priori. The intuitive razor turns out to have mathematical backing.
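The Bayesian point above can be made concrete with a toy sketch (not from the source; the data values are invented). It uses the Bayesian Information Criterion, whose complexity penalty is a standard stand-in for a full Bayes-factor comparison: a fitted parameter buys a better fit, but the fit must improve enough to pay the penalty.

```python
import math

# Toy data: 100 coin flips, 53 heads (invented for illustration).
n, heads = 100, 53
tails = n - heads

# Hypothesis A: fair coin (zero free parameters).
loglik_fair = n * math.log(0.5)

# Hypothesis B: biased coin, bias fitted to the data (one free parameter).
p = heads / n
loglik_biased = heads * math.log(p) + tails * math.log(1 - p)

# BIC = -2 * log-likelihood + k * ln(n); lower is better.
# The k * ln(n) term is the price each extra assumption must pay.
bic_fair = -2 * loglik_fair
bic_biased = -2 * loglik_biased + 1 * math.log(n)

print(f"BIC fair:   {bic_fair:.2f}")
print(f"BIC biased: {bic_biased:.2f}")
# The biased model fits slightly better, but not by enough to justify
# its extra parameter, so the simpler hypothesis is preferred.
```

With 53 heads in 100 flips, the fitted-bias model explains the data marginally better, yet the penalty for its extra parameter outweighs the gain; had the data been 80 heads, the improved fit would pay for the parameter and the razor would favor the biased coin.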
Munger used Occam’s razor as a guard against over-engineered explanations in investing. When a company’s results can be explained by simple economics (good product, growing market), do not reach for elaborate narratives about visionary management or paradigm shifts. The simplest sufficient explanation is usually the most reliable.
Limits
- “Simplicity” is not well-defined — what counts as simpler? Fewer variables? Fewer parameters? Fewer types of entity? A model with three variables and complex nonlinear interactions may be “simpler” in entity count but more complex in behavior than a model with ten linear variables. The razor assumes we can rank explanations by simplicity, but the ranking is often ambiguous.
- Reality is not obligated to be simple — evolution produces baroque, redundant, jury-rigged systems. Quantum mechanics is deeply counterintuitive. The history of science includes many cases where the correct explanation was the less parsimonious one. Continental drift was rejected for decades partly because it seemed to require too many novel mechanisms. The razor can delay acceptance of correct but complex theories.
- The razor can be weaponized against nuance — “the simplest explanation” is frequently deployed to shut down discussion rather than to refine it. Complex social phenomena (inequality, institutional racism, market failures) resist simple monocausal explanations, and invoking Occam’s razor to demand one can be intellectually dishonest.
- It assumes the evidence is complete — the razor selects among hypotheses that explain the current data. But if the data is incomplete (as it usually is), the simpler hypothesis may only appear adequate because the complicating evidence has not yet been gathered. Premature application of the razor can lock in an underfitted model.
- The tool metaphor implies a single clean cut — real intellectual work involves many small judgments about what to include and exclude, not one decisive stroke. The razor image makes parsimony seem like a moment of clarity rather than the slow, iterative process it actually is.
Expressions
- “The simplest explanation is usually the best” — the folk version, slightly distorted (Ockham said “necessary,” not “best”)
- “Entities should not be multiplied beyond necessity” — the traditional Latin-derived formulation
- “Don’t make things more complicated than they need to be” — the pragmatic restatement
- “Shave away the unnecessary” — using the razor metaphor directly
- “KISS: Keep It Simple, Stupid” — the engineering variant, applied to design rather than explanation
- “When you hear hoofbeats, think horses, not zebras” — the medical version: common diagnoses before rare ones
- “Prefer the boring explanation” — startup and tech culture’s version, applied to system failures and market dynamics
- “Extraordinary claims require extraordinary evidence” — Sagan’s formulation, the razor applied to claims rather than hypotheses
Origin Story
The principle is attributed to William of Ockham (c. 1287-1347), an English Franciscan friar and scholastic philosopher, though he never stated it in the exact form usually quoted. His actual writings contain formulations like “Plurality must never be posited without necessity” (Pluralitas non est ponenda sine necessitate). The name “Occam’s razor” was applied retroactively, the “razor” metaphor appearing in the seventeenth century to describe principles that “shave away” unnecessary hypotheses.
The principle predates Ockham. Aristotle wrote that “the more limited, if adequate, is always preferable.” Ptolemy stated “we consider it a good principle to explain the phenomena by the simplest hypothesis possible.” But Ockham applied it with particular force to metaphysical debates about universals, cutting away Platonic entities he considered unnecessary.
In the twentieth century, the razor acquired formal grounding through information theory (Kolmogorov complexity), Bayesian model selection (Bayes factors penalize model complexity), and machine learning (regularization as a computational implementation of parsimony). What began as a medieval philosophical intuition turned out to have deep mathematical structure.
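Kolmogorov complexity itself is uncomputable, but compressed length is a common practical proxy for it. A minimal sketch (an illustration, not from the source) using Python's standard zlib module shows the Solomonoff intuition: data with a short description is structurally simple, data with no exploitable pattern is not.

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Compressed size as a crude, computable proxy for Kolmogorov complexity."""
    return len(zlib.compress(data, level=9))

# A highly patterned sequence: its shortest description is roughly
# "repeat 'ab' 500 times", so it should compress to almost nothing.
structured = b"ab" * 500

# A pseudo-random sequence of the same length (seeded for repeatability):
# no pattern for the compressor to exploit.
rng = random.Random(0)
noise = bytes(rng.randrange(256) for _ in range(1000))

print(description_length(structured))  # small: the pattern compresses away
print(description_length(noise))       # near 1000: incompressible
# Under a Solomonoff-style prior, the data with the shorter
# description is the one that is a priori more probable.
```

zlib is a weak upper bound on true description length, but the ordering it produces is the point: parsimony, measured as description length, is a quantity one can actually compare.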
Munger included it in his mental models toolkit as a general-purpose filter for evaluating business narratives, investment theses, and causal explanations. His practical deployment: when a company’s stock price requires an elaborate story to justify, the story is probably wrong.
References
- William of Ockham. Summa Logicae (c. 1323) — the source writings, though the “razor” label came later
- Thorburn, W.M. “The Myth of Occam’s Razor” (1918), Mind 27(107) — on the attribution history
- Sober, E. Ockham’s Razors: A User’s Manual (2015) — the most thorough modern philosophical treatment
- Solomonoff, R. “A Formal Theory of Inductive Inference” (1964) — the information-theoretic formalization
- Kaufman, P. (ed.) Poor Charlie’s Almanack (2005/2023) — Munger on parsimony in investment analysis
Related Entries
Structural Neighbors
Entries from different domains that share structural shape. Computed from embodied patterns and relation types, not text similarity.
- Hoofbeats, Think Horses (medicine/mental-model)
- The Law Does Not Concern Itself with Trifles (governance/mental-model)
- Treat the Patient, Not the Test (medicine/mental-model)
- Worse Is Better (natural-selection/paradigm)
- Falsification (/mental-model)
- Triage (medicine/metaphor)
- Animals Are Moral Agents (animal-behavior/metaphor)
- Surgical Precision (medicine/metaphor)
Structural Tags
Patterns: removal, matching, scale
Relations: select, prevent
Structure: hierarchy
Level: generic
Contributors: agent:metaphorex-miner