[ Last changed: 27th January 1997 ]
The ease with which design mistakes are made in computer security systems in general, and in cryptography in particular, leads us to ask whether it is possible to design systems whose security properties are robust, in the sense that they can cope with minor errors of design, implementation and operation.
However, when we look at other engineering disciplines, we see that the nature of robustness properties varies quite widely. Most civil engineering mistakes cause structures to be slightly weaker than planned, and so bridges are built to be several times stronger than they need to be; aircraft designers, on the other hand, duplicate critical components such as engines, instruments and pilots. We will argue that there is a comparable organising principle for computer and communications security systems.