next up previous contents
Next: The Trusted Computing Up: Security Policy Previous: Information flow

Aggregation control

The use of access control lists and strong notification are helpful against aggregation threats but are not quite enough to prevent them. The clinician in charge of a safe-haven might be added to the access control lists of millions of hospital patients, making her vulnerable to inducements or threats from illegal information brokers.

Principle 8: There shall be effective measures to prevent the aggregation of personal health information. In particular, patients must receive special notification if any person whom it is proposed to add to their access control list already has access to personal health information on a large number of people.
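The check implied by Principle 8 can be sketched as follows. This is an illustrative assumption, not a specification from the policy: the function names, data layout and the numeric threshold are all invented for the example, and a real system would set the threshold as a matter of policy.

```python
# Sketch of an aggregation check: before a user is added to a patient's
# access control list, count how many other patients' records that user
# can already reach, and trigger the special notification of Principle 8
# if the count is large. All names and the threshold are illustrative.

AGGREGATION_THRESHOLD = 50  # illustrative; the real value is a policy decision

def records_accessible_by(user, acls):
    """Count the patient records whose ACL already lists this user."""
    return sum(1 for members in acls.values() if user in members)

def add_to_acl(user, patient, acls, notify):
    """Add a user to a patient's ACL, first sending the patient the
    'special notification' if the user already has wide access."""
    if records_accessible_by(user, acls) >= AGGREGATION_THRESHOLD:
        notify(patient, f"{user} already has access to many patients' records")
    acls.setdefault(patient, set()).add(user)
```

The point of the sketch is that the notification is generated *before* access is granted, so the patient can object in time.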

Some hospitals' systems contain personal health information on a million or more patients, with all users having access. The typical control at present is a declaration that unjustified access will result in dismissal; but enforcement is often sporadic, and incidents such as the Jackson case continue to be reported. In general, hospital systems tend to be old and poorly administered [AC95a, AC95b].

Hospital systems which give all clinicians access to all data should not be connected to networks. Having 2,000 staff accessing a million records is bad enough; but the prospect of 200 such hospitals connected together, giving 400,000 staff access to the hospital records of most of the population, is unacceptable.

However, there will inevitably be mechanisms for clinicians to access records from outside their own care team, even if these are manual ones. These mechanisms need careful design. As noted above, a corrupt member of staff might falsely claim that a patient has self-referred while on holiday, and ask for a copy of the record to be sent. Even a simple electronic mail system could enable such enquiries to be repeated on an industrial scale.

The primary control on such threats is notification. However, an important secondary control is to keep a count somewhere of who has accessed which records outside their own team. Users who access many records, or a number of records outside the usual pattern, may just be lazy or careless, but they could still be exposing themselves and their colleagues' patients to harm.
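This secondary control can be sketched as a simple tally. The class, the team mapping and the threshold below are assumptions made for illustration; where such a monitor is sited, and who reads its reports, are the organisational questions discussed next.

```python
from collections import Counter

# Illustrative sketch of the secondary control: tally each user's
# accesses to records outside their own care team, and report users
# whose tally exceeds a threshold. Names and threshold are assumptions.

class CrossTeamMonitor:
    def __init__(self, team_of, threshold=20):
        self.team_of = team_of      # maps user -> care team
        self.threshold = threshold  # illustrative reporting threshold
        self.counts = Counter()     # user -> cross-team access count

    def log_access(self, user, record_team):
        """Record an access; only count it if it crosses team boundaries."""
        if self.team_of.get(user) != record_team:
            self.counts[user] += 1

    def flagged_users(self):
        """Users whose cross-team access count exceeds the threshold."""
        return [u for u, n in self.counts.items() if n > self.threshold]
```

Note that accesses within a user's own team are not counted at all, so the monitor need not see the bulk of routine clinical traffic.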

Given the tension between clinicians and administrators on privacy issues, both the location of this count and the choice of the persons responsible for acting on it should be chosen carefully: it might for example involve the clinical disciplinary bodies or healthcare unions. It would also make sense to deal with reports of other computer abuse at the same place. The involvement of the clinical unions may help prevent the central security function being captured by bureaucratic interests and thus preserve the principle of consent.

There are applications in which some aggregation may be unavoidable, such as childhood immunisation programmes. Systems to support them will have to be designed intelligently.

As mentioned above, records may be aggregated for research and audit purposes provided that they are made sufficiently anonymous. It has been suggested that records can be made anonymous by replacing names with NHS numbers and diagnoses with Read codes [RSM92], and a number of systems appear to have been specified on the assumption that this is acceptable. It is not; as noted above, the existing GMSC/RCGP guidelines stipulate that no patient should be identifiable, other than to the general practitioner, from any data sent to an external organisation without the informed consent of the patient [JCG88].

Making data anonymous is hard, especially if it contains linkable information: if an attacker can submit database queries such as `show me the records of all females aged 35 with two daughters aged 13 and 15 both of whom suffer from eczema', then he can identify individuals. The limits of linkage, and techniques for preventing inference, are known as `statistical security' and have been researched in detail in the context of census information [Den82]. Where purely statistical research is proposed, then these techniques may be used; where they are impractical, researchers might be granted access to linkable data within protected space [Boe93].
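One of the simplest controls discussed in the statistical-security literature is query-set-size restriction: a counting query is refused if the set of matching records is very small (or very close to the whole database), since its answer would then single out individuals. The sketch below is illustrative only, with an assumed minimum set size; such restrictions on their own are known to be defeatable by combinations of queries ('tracker' attacks), which is why the stronger techniques surveyed in [Den82] are needed in practice.

```python
MIN_QUERY_SET = 5  # illustrative minimum query-set size

def count_matching(records, predicate):
    """Answer a counting query only if the matching set is neither too
    small nor too close to the whole database (query-set-size control).
    Returns None when the query is refused."""
    n = sum(1 for r in records if predicate(r))
    total = len(records)
    if n < MIN_QUERY_SET or n > total - MIN_QUERY_SET:
        return None  # refuse: the answer would identify individuals
    return n
```

The eczema query in the text would match one record and so be refused; but a pair of broader queries differing only in that record would not be, which is exactly the tracker problem.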






Ross Anderson
Fri Jan 12 10:49:45 GMT 1996