The Misapplication of Occam’s Razor or the Principle of Inappropriate Parsimony

There is a class of design problem that can mislead the unwary systems designer, myself included, although I am getting better at identifying the warning signs. These design problems require the designer to base the solution on a conceptual model of a real-world system or process. It often appears that a simple conceptual model that approximates the real-world system will suffice. In practice, solutions based on these simple, low-fidelity models fail to handle the problem completely and often cause a whole series of new problems. Redesigning the solution to handle these exceptions only produces more problems that require more redesign, and so on. If the hapless designer persists, s/he will often go through several complete redesigns before arriving at a solution that finally solves the problem by modeling the real-world system with a high degree of fidelity. This iterative redesign process is typical of this class of design problem. Some might say that only poor designers are ever trapped in this way; others will say these solutions are anti-patterns. But I think there is more to it.

The type of design problem under consideration here is particularly intractable because there exists a series of approximate conceptual models of increasing fidelity and complexity. Faced with situations like these, many systems designers will claim to be applying Occam's razor when they opt to base their solution on the simplest conceptual model. But a solution based on an approximation is only as good as the approximation. The only way to improve such a solution, if it is insufficient, is to replace the low-fidelity conceptual model with one of higher fidelity. The worst type of problem is one that has many plausible conceptual models, each of slightly higher fidelity and complexity than the last. The slavish misapplication of the principle of parsimony will condemn our systems designer to step through each successively more complex model until they finally reach one with the required fidelity.

Occam's Razor: when faced with several explanations of a phenomenon, one should always choose the simplest, the one that requires the fewest leaps of logic.

This leads to a tentative conclusion: Occam’s Razor is no good for selecting between alternative models if the alternatives are approximations with differing fidelity. A simple low fidelity model cannot be compared with a complex high fidelity one.

An old joke comes to mind. An engineer, a physicist, a mathematician, and a biologist were asked to define Pi. The engineer said "about 3", the physicist said "3.14159 ± 0.00001", the mathematician said "the circumference of a circle divided by its diameter", and the biologist said "what is Pi?" Choosing the least complex, lowest-fidelity model is not always the right answer! Too many systems designers think there is virtue in always assuming Pi should be about three.

This entry was posted in System Design.
  • Mike B.

    OK, John, I'll finally make some comments, but I have to work my way up from the bottom and start with this entry.

    I think I understand the point you are trying to make about problems with many plausible conceptual models with increasing degrees of complexity. However, can't it be said that the point of the analysis and design phase is to pick the correct conceptual model and then evolve it through as many iterations as necessary until it has the right level of detail? I know you say the following: "This iterative redesign process is typical of this class of design problem. Some might say that only poor designers are ever trapped in this way…" but I think that getting the correct level of detail by evolving the architecture *is* a skill and an art that good designers (who may not be in great number) have naturally and/or acquire through experience.

    Or perhaps another way to look at this class of problems is that the lower-fidelity solutions are not one of the "several explanations of a phenomenon" described in Occam's Razor and thus should not be part of the set of potential solutions. That is, the problem may not be in applying Occam's Razor to these types of problems, but instead in getting the right set of possible solutions (with the appropriate amount of detail) to choose the simplest solution from. So the difficult part of these types of problems may actually be determining the set of potential models.

    Of course, I may not know what I'm talking about here. Plus, you've designed and reviewed many more systems than I have ever seen, so take my comments with a grain of salt!

  • http://www.virtualtravelog.net/ John

    Glad you made the time to comment, Mike. I agree with your suggestion that the point of analysis and design is to pick the correct conceptual model and then evolve it. But this type of problem is almost always peripheral to the main focus of the system. The designer does not want to focus on these issues; in fact, they want to expend as little time as possible on them. They want to identify the conceptual model and move on. The fact that these problems appear to have a simple solution is why they are so nasty. Some examples should help illustrate the point. Correct handling of time zones, supporting multiple writing systems, and ensuring two remote transactional systems contain the same data are all problems with simple, low-fidelity conceptual models that are reasonable if you squint your eyes and stand at a distance but fall apart when you examine them closely.
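    To make the time-zone example concrete, here is a minimal sketch (mine, not from the original discussion; it assumes Python's standard zoneinfo module, available since 3.9). A fixed-offset model of New York agrees with the full tz-database model in winter, but silently drifts an hour once daylight saving time begins:

    ```python
    from datetime import datetime, timedelta, timezone
    from zoneinfo import ZoneInfo  # stdlib since Python 3.9

    # One UTC instant in July, when daylight saving time is in effect.
    instant = datetime(2023, 7, 1, 16, 0, tzinfo=timezone.utc)

    # Low-fidelity model: "New York is UTC-5", a fixed offset.
    naive_local = instant.astimezone(timezone(timedelta(hours=-5)))

    # Higher-fidelity model: the full tz database entry, with DST rules.
    real_local = instant.astimezone(ZoneInfo("America/New_York"))

    print(naive_local.hour)  # 11 -- off by an hour: DST is in effect
    print(real_local.hour)   # 12 -- correct: EDT is UTC-4
    ```

    The two models make the same predictions for half the year, which is exactly why the simple one looks reasonable from a distance.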

    By simplifying the conceptual model the busy designer can focus on the main area of the solution. This is also one of the key skills of a good designer. But in these cases it is a false economy. This is the misapplication of Occam's razor.

    You say the problem may be identifying the right set of candidates from which to choose the solution. I agree in an ideal world the designer would produce multiple candidate designs. But the world is far from ideal; the average designer never has enough time. Part of the art of being a designer is knowing when you can get away with using a low fidelity solution. The problem we are discussing here is that there are some problems that are particularly deceptive when you take this approach.

  • Mike B.

    Hi John -

    After I posted my response, I was wondering if I missed your point a little…maybe because I was under pressure from you to respond to something? =) Also, perhaps I was confusing an interpretation of Occam's Razor that I am more familiar with:

    “when you have two competing theories which make exactly the same predictions, the one that is simpler is the better”

    (I found this quote by searching the Web using Google.)

    Also, I was probably adding in some of the Einstein quote which is one I usually think of when discussing making things simple: “things should be made as simple as possible, no simpler”. My argument wasn’t necessarily that multiple designs should be created. I think I was arguing that:

    1. The design process should be iterative, and this evolution would result in a design with the correct fidelity. Knowing when to stop is where a designer's skill and expertise come in.
    2. According to the above interpretation of Occam's Razor, designs with different degrees of fidelity do not make "exactly the same predictions", so Occam's Razor is not violated, just not appropriate. (Which may just be a restatement of your original point.) A good example, which you describe in another entry, is time zones, especially with respect to daylight saving time and leap days/minutes. In this case, a lower-fidelity solution would not make exactly the same predictions (i.e. give the same results) as the design you outline, and, therefore, applying Occam's Razor is not appropriate.

    From my limited experience, complex systems are difficult to deal with, and thus a lot of techniques and rules (such as Occam's Razor) are applied to try to simplify them so we can get our minds around them and deal with them. So, is your point really that, in some cases, these techniques break down and that the designer needs to be aware of it? Anyway, I don't know if I'm being very clear in this post, but I probably wasn't very clear in my first one. I think that the last part of your response to me above sums up your point very well (and I definitely agree with it):

    Part of the art of being a designer is knowing when you can get away with using a low fidelity solution. The problem we are discussing here is that there are some problems that are particularly deceptive when you take this approach.

  • http://www.virtualtravelog.net John

    Mike

    Your point 2 sums up what I was saying well. Also, your definition of Occam's razor is better than mine, since it stresses that competing theories can only be compared when they make the same predictions. My point is that a busy designer may think solutions of differing complexity make the same predictions when in fact they have different levels of fidelity and cannot be compared. Too many designers fail to realize this and only later discover the devil in the details, as the solution falls apart in exceptional situations.

    Additionally, I have come across too many designers who think it is always better to simplify. While I agree it is usually better to simplify, I believe this class of problem proves they are wrong to be so absolute.