
No Free Lunch Theorem

mental-model proven

Source: Mathematical Optimization

Categories: mathematics-and-logic, computer-science

Transfers

Wolpert and Macready’s No Free Lunch theorems (1997) prove that, averaged over all possible optimization problems, no algorithm outperforms any other — including random search. Any algorithm’s superior performance on one class of problems is exactly compensated by inferior performance on another class. The theorem’s metaphorical power comes from its name: there is no free lunch. Every advantage is paid for somewhere.
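The averaging claim can be checked by brute force on a toy search space. The sketch below (an illustration, not Wolpert and Macready's construction) enumerates every possible cost function on a four-point domain with binary costs, then compares two deterministic, non-repeating search strategies. Their average best-found cost after any number of evaluations comes out identical — the equality the theorem guarantees.

```python
from itertools import product

# Toy NFL demonstration. Domain: 4 points; costs: {0, 1}.
# We enumerate all 2^4 = 16 possible cost functions and compare
# two deterministic, non-repeating search orders.

DOMAIN = [0, 1, 2, 3]

def best_after(order, f, m):
    """Best (minimum) cost seen after evaluating the first m points in `order`."""
    return min(f[x] for x in order[:m])

ascending = [0, 1, 2, 3]    # one "algorithm": scan left to right
descending = [3, 2, 1, 0]   # a rival "algorithm": scan right to left

# Every cost function f: DOMAIN -> {0, 1}
all_fs = [dict(zip(DOMAIN, values)) for values in product([0, 1], repeat=4)]

for m in range(1, 5):
    avg_asc = sum(best_after(ascending, f, m) for f in all_fs) / len(all_fs)
    avg_desc = sum(best_after(descending, f, m) for f in all_fs) / len(all_fs)
    # Averaged over ALL cost functions, the two strategies tie exactly.
    assert avg_asc == avg_desc
    print(f"after {m} evals: avg best = {avg_asc}")
```

The symmetry behind the tie is visible here: permuting the domain maps the set of all cost functions onto itself, so no evaluation order can gain an edge on average. Any strategy that looks clever on some functions is being subsidized by the functions where it looks foolish.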

Key structural transfers:

Limits

Expressions

Origin Story

David Wolpert and William Macready published “No Free Lunch Theorems for Optimization” in 1997, proving that no optimization algorithm can outperform any other when averaged over all possible cost functions. The name deliberately invokes the American aphorism “there ain’t no such thing as a free lunch” (TANSTAAFL), popularized by Milton Friedman and Robert Heinlein, which encodes the economic principle that everything has an opportunity cost. The theorem rapidly became one of machine learning’s most cited results, though it is also one of the most frequently misapplied — invoked to justify conclusions far broader than its technical conditions support. The economist’s version predates the mathematical one by decades: Friedman used TANSTAAFL as a core teaching device throughout the 1970s.

References

Related Entries

Structural Neighbors

Entries from different domains that share structural shape. Computed from embodied patterns and relation types, not text similarity.

Structural Tags

Patterns: balance, matching, boundary

Relations: prevent, select

Structure: equilibrium

Level: generic

Contributors: agent:metaphorex-miner