The NHSE's cryptography strategy includes fairly detailed costings. However, these lack credibility for several reasons. We have already mentioned the lack of provision for software integration of hardware crypto functions under `Global Cost Model 1' (p 38); there are many more omissions and dubious assumptions.
For example, the strategy assumes that there will be one, or at most two, `trusted third party' key management centres, with all key management tasks performed by eight full-time staff equivalents. Assuming two million personnel changes in the NHS each year (hirings, firings, and moves), this would entail each member of staff administering about a hundred key changes per hour, which is perhaps an order of magnitude more than could be achieved with even fairly cursory checking of credentials and entitlements. So even under the IMG's own assumptions, the number of staff needed would be more like a hundred.
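The arithmetic behind this estimate can be checked with a short calculation. The two million annual changes and eight staff are the strategy's figures; the working hours per staff-year and the realistic processing rate are our own rough assumptions:

```python
# Back-of-envelope check of the key-management staffing estimate.
# From the strategy: 2 million personnel changes/year, 8 staff.
# Assumed by us: ~46 working weeks of 37.5 hours per staff-year,
# and ~10 key changes/hour as a realistic rate with credential checking.
changes_per_year = 2_000_000
staff = 8
hours_per_staff_year = 46 * 37.5  # roughly 1,700 working hours

changes_per_staff_hour = changes_per_year / (staff * hours_per_staff_year)
print(f"Key changes per staff hour: {changes_per_staff_hour:.0f}")

realistic_rate = 10  # changes/hour, assumed
staff_needed = changes_per_year / (realistic_rate * hours_per_staff_year)
print(f"Staff needed at {realistic_rate}/hour: {staff_needed:.0f}")
```

Under these assumptions each of the eight staff would have to process well over a hundred changes per hour, and a workforce of roughly a hundred would be needed at any plausible rate, consistent with the figure in the text.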
But these assumptions are suspect on several grounds. According to the strategy, the TTP centre or centres would not be staffed in the evenings or at weekends. So if a key were compromised on Friday evening, it could not be revoked until Monday -- leaving the attacker a whole weekend in which to attack systems, steal personal health information, forge prescriptions, and so on. This is not acceptable. Even in the banking sector, where lives are not usually at risk, a 24-hour revocation facility is considered essential; a bank in Germany that did not provide one was held liable for card fraud .
Furthermore, as discussed above, there would be an enormous overhead in administering a centralised key management system. A GP who hired a locum for the day would presumably have to call the key management centre to get key material issued so that the locum could be added to the relevant access control lists. Assuming that the phone were answered, and assuming that there were no hitches, and assuming that ten requests could be handled per hour (which would be brisk), then the GP would still have spent perhaps ten minutes on the operation. This has also not been budgeted for.
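As a rough illustration of how this unbudgeted overhead scales across the service, the sketch below totals the GP time consumed. Only the ten minutes per call comes from the text; the number of practices and the frequency of locum hirings are loudly hypothetical figures of ours, used purely to show the order of magnitude:

```python
# Hypothetical aggregate of GP time spent on key-management calls.
# From the text: ~10 minutes of GP time per locum registration.
# HYPOTHETICAL (our assumptions, not the strategy's): ~9,000 GP
# practices, each making ~50 such registrations per year.
minutes_per_call = 10
practices = 9_000                 # assumed, for illustration only
registrations_per_practice = 50   # assumed, for illustration only

total_hours = practices * registrations_per_practice * minutes_per_call / 60
print(f"GP hours per year on key administration: {total_hours:,.0f}")
```

Even under these deliberately modest assumptions the total runs to tens of thousands of clinician-hours a year, none of which appears in the costings.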
In reality, as discussed above, there will be not one certification authority but many. The strategy takes no account of this in its costings. Nor does it attempt to explore the cheaper functional alternatives.
A second cost issue is standards development. The report does not budget for this either. When this issue was raised, the supplementary strategy document claimed that standards costs had been included in the £100 per user estimated licence fee for encryption software. This price is indeed high: the supplier of the encryption software for GP-provider links charges only £25, and PGP can be licensed for SFr30 (under £20); so the explanation appears plausible. However, it was not made explicit in the strategy document, and in any case it raises the question of whether monopoly provision of cryptographic services is likely, or indeed necessary, for the standards work to be done. The long-term cost implications for the NHS of such a monopoly also need to be considered.
A third missing cost is training. On p 30, the strategy document asserts that cryptographic security can be implemented without significant expenditure on user training, and on p 53 that the entire operation of cryptographic security can be made invisible to the user. This contradicts both common sense and a talk given by the IMG's senior adviser at ESORICS 94 in Brighton.
A fourth missing cost is that of system migration. Introducing new ways of doing things is never free; yet the strategy's line is that `much of the above project manpower could be provided by the NHS from its own staff resources'. In other words, the stated costs apply to items such as equipment purchase and security consultancy. The strategy appears to ignore the fact that a large part of NHS computing is contracted out, and that `own staff resources' are meagre. Even where they exist, they still have to be paid for.
A fifth missing cost is evaluation. The report contains no budgetary provision for the assessment and testing of cryptographic products or implementations; it simply contains items such as `HISS -- 6 systems at £75K' (p 39). But evaluation of these changes is necessary and expensive. The report suggests the use of ITSEC ; yet an ITSEC evaluation costs in the range of £100,000 to £1,000,000 depending on the product's complexity. At only £100,000 per system, evaluation will more than double the figures in `Global Cost Model 2'.
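The effect on the HISS line item can be made explicit. Both figures below are from the text (the £75K per system from p 39, and the £100,000 lower bound of the ITSEC evaluation range):

```python
# Effect of adding ITSEC evaluation costs to the HISS line item (p 39).
systems = 6
budgeted_per_system = 75_000   # GBP, from `Global Cost Model 2'
evaluation_per_system = 100_000  # GBP, low end of the ITSEC range cited

budgeted = systems * budgeted_per_system
with_evaluation = budgeted + systems * evaluation_per_system
print(f"Budgeted: GBP {budgeted:,}")
print(f"With evaluation at the low end: GBP {with_evaluation:,}")
```

Even at the bottom of the cited range, the £450K line item becomes £1,050K, confirming that evaluation more than doubles the budgeted figure.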
When tackled on this point at the June meeting, the IMG agreed that evaluation costs had been omitted, stating that ITSEC evaluation was in their view an expensive and unnecessary piece of bureaucracy. This view would not be supported by the director of CESG, according to whom almost all products submitted for ITSEC evaluation have had serious vulnerabilities found and corrected, with some products having as many as twenty . In our experience too, the large majority of system security failures result from errors in implementation or operation that are discovered by chance and exploited in an opportunist way [3, 13]. Reducing the incidence and severity of these blunders is the purpose and function of evaluation. Ignoring evaluation will not only lead to incorrect and inadequate budgeting, but is also at odds with both government policy and best practice.
For all of the above reasons, we believe that the IMG's cost estimates are not robust and will not stand up to the Public Accounts Committee's scrutiny. A realistic strategy must produce cost estimates that can be defended before such scrutiny.