*does not* mean "All other things being equal, the simplest solution is the best." Here's the quote:

Ockham's razor (sometimes spelled Occam's razor) is a principle attributed to the 14th-century English logician and Franciscan friar, William of Ockham. The principle states that the explanation of any phenomenon should make as few assumptions as possible, eliminating those that make no difference in the observable predictions of the explanatory hypothesis or theory. The principle is often expressed in Latin as the lex parsimoniae ("law of parsimony" or "law of succinctness"): "entia non sunt multiplicanda praeter necessitatem", roughly translated as "entities must not be multiplied beyond necessity". An alternative version "Pluralitas non est ponenda sine necessitate" translates "plurality should not be posited without necessity". [1]

This is often paraphrased as "All other things being equal, the simplest solution is the best." In other words, when multiple competing hypotheses are equal in other respects, the principle recommends selecting the hypothesis that introduces the fewest assumptions and postulates the fewest entities. It is in this sense that Occam's razor is usually understood. This is, however, incorrect. Occam's razor is not concerned with the simplicity or complexity of a good explanation as such; it only demands that the explanation be free of elements that have nothing to do with the phenomenon (and the explanation).

Originally a tenet of the reductionist philosophy of nominalism, it is more often taken today as a heuristic maxim (rule of thumb) that advises economy, parsimony, or simplicity, especially in scientific theories. Here the same caveat applies to confounding topicality with mere simplicity. (A superficially simple phenomenon may have a complex mechanism behind it. A simple explanation would be simplistic if it failed to capture all the essential and relevant parts.)

## 4 comments:

I've been playing with some statistics related to this idea recently; specifically, the lasso:

http://www-stat.stanford.edu/~tibs/lasso.html

The idea is to fit a least squares regression with the minimum number of free parameters (i.e. the fewest assumptions). The tricky bit is weighing the number of assumptions against the quality of the fit ("with enough free parameters, I can fit you a horse") and figuring out how to interpret the complex fitting algorithm in terms of degrees of freedom.

Annals of Statistics 35:2173 gives a neat proof that the degrees of freedom of the lasso fit are well estimated by the number of nonzero fitted coefficients, and Annals of Statistics 32:407 gives an implementation with the same computational cost as a regular least squares fit.
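The parsimony trade-off described above can be sketched in a few lines. This is a minimal illustration, not the implementation from those papers; it assumes scikit-learn's `Lasso` and `LinearRegression` classes, and the data, penalty strength `alpha=0.1`, and coefficient values are made up for the example:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))

# True model uses only 2 of the 10 predictors; the other 8 are irrelevant.
beta = np.zeros(p)
beta[0], beta[1] = 3.0, -2.0
y = X @ beta + rng.normal(scale=0.5, size=n)

# Ordinary least squares happily "multiplies entities": with enough free
# parameters it fits everything, so all 10 coefficients come out nonzero.
ols = LinearRegression().fit(X, y)

# The lasso's L1 penalty shrinks coefficients toward zero and sets the
# irrelevant ones exactly to zero, trading a little fit quality for a
# model with fewer assumptions.
lasso = Lasso(alpha=0.1).fit(X, y)

print("OLS nonzero coefficients:  ", int(np.sum(ols.coef_ != 0)))
print("Lasso nonzero coefficients:", int(np.sum(lasso.coef_ != 0)))
```

The penalty strength `alpha` is exactly the "tricky bit" mentioned above: it weighs the number of retained parameters against the quality of the fit, and in practice it is usually chosen by cross-validation.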

I think I just heard something flying over my head.

Sounds like fun, Mark :)

Coincidentally, right before you checked I totally made that up and put it in wikipedia.

They have since reverted my edit; it's now back to normal.

And seriously, did you really believe that complicated explanation of Occam's Razor? Obviously the simpler explanation is correct.

That's hilarious!
