Thursday, June 14, 2007

Robust Systems

Information systems should be robust. A robust system is one that operates correctly and efficiently under a wide range of conditions, even under conditions for which it was not specifically designed. In particular, a robust system resists attempts to make it operate incorrectly. In other words, it is secure.

While the above sounds simple enough, "correct", "efficient", and "secure" as used above are actually terms of art. An information system is a collection of devices or programs organized to change, transmit, or store data. A system can be said to be correct when its output can be shown to meet its specification for every input, and efficient when it completes its job with an acceptable and predictable use of resources.

Systems more complicated than shoelaces should be designed in two or more stages, each of which offers an opportunity for debugging. The optimal breakdown of stages is the one that makes the most progress toward completion that can still be effectively debugged, while maximizing the likelihood of catching a mistake before the next stage begins.

Along with basic engineering practices, the principles that lead to robust operation are the same ones recognized by security experts:
  1. Economy of Mechanism - keep things simple. Simpler mechanisms are easier to understand and are generally more robust.
  2. Fail Safe Design - Erroneous input should result in the least harmful action.
  3. Open Design - The reliability of the system should not depend on keeping its workings hidden.
  4. Complete Access Control - All access to assets should be allowed only to those authorized.
  5. Least Privilege - Access to assets should be given only as required.
  6. Separation of Privilege - Access to assets should be based on multiple independent criteria.
  7. Least Common Mechanism - Shared means of operations should be minimized.
  8. Psychological Acceptability - If the perceived inconvenience associated with system safeguards is higher than the perceived value they allow, users will tend either to circumvent the safeguards or not to use the system. Therefore, measures should be implemented only if:
    1. They can be built into the system such that following them will be no harder than avoiding them
    2. They are more likely to mitigate a threat than to cause user frustration
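A couple of these principles are easy to see in code. Here is a minimal sketch of fail-safe design combined with least privilege: grants are explicit and minimal, and anything not explicitly granted is denied. The names (`Permission`, `GRANTS`, `check_access`) are my own illustrations, not any particular API.

```python
# Sketch of default-deny access control.
# Permission, GRANTS, and check_access are illustrative names only.
from enum import Enum

class Permission(Enum):
    READ = "read"
    WRITE = "write"

# Least privilege: each grant is explicit and as narrow as possible.
GRANTS = {
    ("alice", "report.txt"): {Permission.READ},
}

def check_access(user, resource, permission):
    """Fail-safe default: any missing entry means the answer is 'no'."""
    allowed = GRANTS.get((user, resource), set())
    return permission in allowed
```

Note that an unknown user, an unknown resource, or an ungranted permission all fall through to the same harmless outcome: denial. No erroneous input can produce a surprise grant.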
Here are some principles to bear in mind when creating correct, robust systems.
  1. Garbage In, Garbage Out: Input should be validated before it is used.
  2. Efficiency: when possible, the resources used by a process should not grow faster than the size of the input. Resource use should be bounded by a function of the input's size alone; for robustness, it should not depend on which particular input of that size is given.
  3. Correctness: a process cannot be known to be efficient unless it is known to be correct. If some inputs yield spurious results, that is not robust.
  4. Special Cases: the allowance for special cases signals that a design can be improved.
  5. Hope is the Enemy of Know: The result doesn't care what you hoped it would be.
  6. Expect Failure: Component failure should be an expected part of operation.
  7. Capacity: rooted in efficiency, adequate capacity, meaning headroom beyond the expected load, is a vital part of robustness.
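The first two points can be made concrete in a few lines. The sketch below validates input against explicit rules before any use, and the check itself costs time proportional to the length of the input, no more; the field and its limits (`age`, 0 to 150) are made-up examples.

```python
# Garbage In, Garbage Out: validate before use.
# The field and range below are an invented example.
def parse_age(raw):
    """Return an int age in [0, 150], or raise ValueError.

    Every check is O(len(raw)): the cost is bounded by the input's
    size and does not depend on which particular characters appear.
    """
    text = raw.strip()
    if not text.isdigit():
        raise ValueError(f"age must be digits, got {raw!r}")
    age = int(text)
    if not 0 <= age <= 150:
        raise ValueError(f"age out of range: {age}")
    return age
```

Garbage such as `"-1"`, `"abc"`, or `"200"` is rejected at the boundary, before it can propagate into the rest of the system as a spurious result.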
So enough of these generalities. What are some examples of robust systems?
  • Handicapped stalls
    If I have to use a public restroom, I use the handicapped stall whenever possible. These stalls are roomier, and generally better lit.

  • Automatic doors
    Grocery stores have long used automatic doors, which sense an approaching customer and open without being touched. They work equally well for everyone, from small children to the elderly, one of the hallmarks of a robust design.

  • 36-inch doors and accessible homes
    Similarly, while 36-inch doors make rooms accessible to those in wheelchairs, they also facilitate moving furniture.

Robustness is not just a matter of handling lots of volume. It's a question of design. A robust system is designed to handle all cases equally well, because it doesn't play favorites. The same algorithm, method, or formula used to handle the most common case handles the unusual ones. That gives the best chance for handling cases we don't even expect to encounter with the same aplomb that we handle the ones we do expect.
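The same idea shows up in miniature in code: when one formula covers the common case and the edge cases alike, the special-case branch disappears. A toy sketch (the function is my own illustration, not from the text above):

```python
def pad(text, width, fill=" "):
    """Right-pad text to width with one rule for every case.

    max(width - len(text), 0) handles short, exact-width, and
    over-long inputs with the same formula; no special-case
    branch is needed, so no case gets second-class treatment.
    """
    return text + fill * max(width - len(text), 0)
```

A version written as `if len(text) >= width: return text` followed by a separate padding path would compute the same answers, but the branch is a signal, per the Special Cases principle above, that the design can be improved.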
