While the above sounds simple enough, "correct", "efficient", and "secure" as used above are actually terms of art. An information system is a collection of devices or programs organized to change, transmit, or store data. A system can be said to be correct when its output can be shown to be right for all valid inputs, and efficient when it completes its job with an acceptable and predictable use of resources.
Systems more complicated than shoelaces should be designed in two or more stages. At each stage there is the potential for debugging. Choosing the stages well means making each one cover as much progress toward completion as can still be effectively debugged, maximizing the likelihood of catching a mistake before the next stage begins.
Along with basic engineering practices, the principles that lead to robust operation are the same ones recognized by security experts:
- Economy of Mechanism - keep things simple. Simpler mechanisms are easier to understand and are generally more robust.
- Fail Safe Design - Erroneous input should result in the least harmful action.
- Open Design - The reliability of the system should not depend on keeping its workings hidden.
- Complete Access Control - Every access to every asset should be checked for authorization.
- Least Privilege - Access to assets should be given only as required.
- Separation of Privilege - Access to assets should be based on multiple independent criteria.
- Least Common Mechanism - Shared means of operations should be minimized.
- Psychological Acceptability - If the perceived inconvenience of a system's safeguards outweighs their perceived value, users will tend either to circumvent the safeguards or not to use the system at all. Therefore, safeguards should be implemented only if:
- They can be built into the system such that following them will be no harder than avoiding them
- They are more likely to mitigate a threat than to cause user frustration
- Garbage In, Garbage Out: Input should be validated before it is used
- Efficiency: when possible, the resources a process uses should not grow faster than the size of its input. Resource use should be bounded by a function of input size, and for robustness it should not depend on which particular input is given.
- Correctness: a process cannot be known to be efficient unless it is known to be correct. If some inputs yield spurious results, the process is not robust.
- Special Cases: the allowance for special cases signals that a design can be improved
- Hope is the Enemy of Know: The result doesn't care what you hoped it would be
- Expect Failure: Component failure should be an expected part of operation.
- Capacity: rooted in efficiency, the ability to handle the expected load is a vital part of robustness.
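The "Garbage In, Garbage Out" and "Fail Safe Design" principles above can be sketched together in a few lines. This is a minimal illustration, not code from the text; `parse_port` is a hypothetical validator that rejects bad input up front and, on failure, takes the least harmful action (returning nothing rather than passing garbage downstream):

```python
def parse_port(text):
    """Validate a TCP port number before it is used anywhere else.

    Fail safe: on erroneous input, return None (the least harmful
    action) instead of guessing or propagating garbage.
    """
    try:
        port = int(text)
    except (TypeError, ValueError):
        return None          # not a number at all
    if not 1 <= port <= 65535:
        return None          # a number, but not a legal port
    return port

parse_port("8080")           # → 8080
parse_port("80; rm -rf /")   # → None, rejected before use
```

Validating at the boundary keeps every later stage simpler, which is Economy of Mechanism at work as well.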
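The Efficiency principle, that resource use should be bounded by input size and not depend on the particular input, can be made concrete with a small sketch of my own (not from the text). A naive duplicate-removal loop that scans a list is O(n²), and its running time swings with how the input happens to be arranged; tracking seen items in a set keeps the work linear in the input size alone:

```python
def dedupe(items):
    """Remove duplicates, preserving order, in O(n) time and memory.

    Checking `item in result` against a plain list would be O(n^2),
    and its cost would depend on the particular input. A set makes
    each membership test O(1) on average, so resource use is bounded
    by the size of the input alone.
    """
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

dedupe([3, 1, 3, 2, 1])  # → [3, 1, 2]
```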
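"Expect Failure" also lends itself to a short sketch. The retry helper below is an illustrative assumption, not a prescribed technique from the text: it treats a component failure (here, any `OSError`) as a normal, expected event, retrying with exponential backoff rather than assuming the operation always succeeds:

```python
import time

def call_with_retry(operation, attempts=3, base_delay=0.01):
    """Run operation(), treating transient failure as expected.

    Retries with exponential backoff; if every attempt fails, the
    final exception is surfaced rather than silently swallowed.
    """
    for attempt in range(attempts):
        try:
            return operation()
        except OSError:
            if attempt == attempts - 1:
                raise                      # out of retries
            time.sleep(base_delay * (2 ** attempt))
```

Building the retry into the call site means a flaky disk or network hiccup is handled by design instead of by hope.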
- Handicapped stalls
When I have to use a public restroom, I use the handicapped stall whenever possible. These stalls are roomier and generally better lit.
- Automatic doors
Grocery stores have long used automatic doors, which sense an approaching customer and open without being touched. They work equally well for everyone, from small children to the elderly, which is one of the hallmarks of a robust design.
- 36-inch doors and accessible homes
Similarly, while 36-inch doors make rooms accessible to those in wheelchairs, they also facilitate moving furniture.