NHS Confidentiality Consultation - FIPR Response

  1. The Foundation for Information Policy Research is an independent body that studies the interaction between information technology and society. Its goal is to identify technical developments with significant social impact, commission research into public policy alternatives, and promote public understanding and dialogue between technologists and policy-makers.
  2. The confidentiality of personal health information has caused problems for designers of healthcare IT systems in many countries. The owners of these systems usually find it convenient to have unrestricted access to medical information, while most patients are uncomfortable with their medical records being made available to people outside the circle of health professionals who care for them directly. Patients are not always represented well, or at all, when system designs are negotiated. The outcome is that large amounts of money are spent on systems that fail to protect privacy.
  3. The most expensive example may be found in the USA, where many medical record systems were organised for the benefit of insurers and also made information widely available to employers and commercial researchers. A public backlash led to the Health Insurance Portability and Accountability Act, which is imposing substantial costs and disruption on the US healthcare IT industry - despite the privacy regulations made under the Act having been watered down by fierce industry lobbying and by the change of US administration in 2001.
  4. Another example may be found in Iceland, where arrangements to sell medical information to a private company engaged in genetic research led to widespread protests by healthcare professionals and to over 10% of the country's population opting out of the scheme.
  5. The NHS has been attempting since at least 1992 to follow an information strategy that involves making the population's medical records available electronically to central government. Although this strategy has been tinkered with, rebranded and relaunched on several occasions, the main architecture has remained unchanged. It includes a database recording all of a patient's relationships with healthcare providers, and another (`Clearing') that processes and records payments made to secondary and tertiary providers by purchasers. These are mined for management statistics and for a growing number of other applications. Although the strategy has occasionally been promoted as capable of sharing information between providers involved in a particular patient's care, this has not happened to any great extent. It is unclear whether communication between providers was ever a high-priority goal of the system designers; development was driven rather by the centre's hunger for information, and clinical communications always came second.
  6. The fundamental question is whether the Department of Health should have a database containing a fairly complete record of every hospital treatment in the UK, including not just the treatment code and the cost, but also the name and address of the patient. A secondary question is whether the Department of Health should have an accessible central record of all of a patient's care relationships. For example, a patient who attends a sexually transmitted diseases clinic is entitled to keep this fact secret even from his GP. It is quite unreasonable for such a record to be available to ministers and civil servants, especially in view of the last few governments' interest in obtaining derogatory information not just on opponents and activists but also on members of the public involved in politically sensitive events (such as rail crash victims).
  7. FIPR believes that no one in central government - whether ministers, DoH officials or NHS central managers - should have access to identifiable health information on the whole UK population. This is backed up by studies showing that although patients trust their carers with medical information, the majority do not trust NHS administrators [1] [2] [3]. The creation of such valuable repositories will also create extreme temptation for abuse, both in the sense of unauthorised access and in the sense of authorised access being extended to more and more interests.
  8. In the short term, Clearing data should be deleted as soon as payment has cleared. Given that the data are collected for the purposes of payment, retaining them for other uses afterwards is a clear breach of data protection principles. In the longer term, purchasers and providers of healthcare should settle their accounts using decentralised payment mechanisms, and preferably the same kind of payment mechanisms that are used by normal businesses. We can see no reason why a GP referring a patient to hospital should not write on the referral letter an order number, which would be cited on the ensuing invoice from the hospital to the PCT. There is no need for patient names, dates of birth or other identifying information to appear on payment transactions. Using standard business processes would also enable commercial off-the-shelf accounting software to be used by purchasers and providers alike.
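The kind of decentralised settlement we have in mind is nothing more exotic than ordinary purchase-order accounting. The following is a minimal sketch, in Python, of what the referral order and the ensuing invoice might carry; the class and field names are our own illustrations, not any NHS specification, and the point is simply that nothing in the payment path needs to identify the patient.

```python
# Minimal sketch of order-number-based settlement between a purchaser (PCT)
# and a provider (hospital). All names and fields are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class ReferralOrder:
    order_number: str   # quoted by the GP on the referral letter
    purchaser: str      # the PCT that will pay
    provider: str       # the hospital referred to
    service_code: str   # the type of care commissioned
    issued: date

@dataclass
class Invoice:
    invoice_number: str
    order_number: str   # cites the referral order - no patient identifiers
    provider: str
    amount_pence: int

def approve_invoice(invoice, open_orders):
    """Pay only invoices that cite an order this purchaser actually issued."""
    order = open_orders.get(invoice.order_number)
    return order is not None and order.provider == invoice.provider

# The GP writes PO-2003-00042 on the referral letter; the hospital cites it
# on its invoice; the PCT matches the two and pays, never handling the name,
# date of birth or address of the patient.
orders = {"PO-2003-00042": ReferralOrder("PO-2003-00042", "Anytown PCT",
                                         "St Elsewhere NHS Trust", "ORTHO-OP",
                                         date(2003, 2, 3))}
invoice = Invoice("INV-881", "PO-2003-00042", "St Elsewhere NHS Trust", 42_500)
assert approve_invoice(invoice, orders)
```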
  9. It is therefore inappropriate to use a payment mechanism as a cover for a centralised healthcare data collection exercise. It is comparable with a government, inquisitive about the usage of pornography and sex toys, insisting that all items purchased from sex shops be bought using credit cards - and that all the card payments be processed through a bank owned by the government, in order that the payment vouchers could contain detailed purchase breakdowns rather than just the billing total. The public would not accept such surveillance in the case of sex shops, and we are confident that they would not accept it for healthcare either if they were fully aware of the extent of the surveillance.
  10. Turning now to the consultation documents: `Share with Care' is not a balanced discussion of the issues but rather a report on focus group discussions. It appears from the video script that the participants were not properly briefed on patterns of information use within the NHS. The many administrative uses of identifiable personal information that were documented in the Caldicott Report are glossed over, and the emphasis is placed on more acceptable uses such as treatment and public health. Even so, 53% of those consulted did not want hospital managers to have access to any clinical information in their records, and only 22% were prepared for hospital managers to have access to full records (graph, page 24). It is curious that this graph has no explicit key; the meanings of the different coloured bars have to be inferred from the text, which goes out of its way to place a positive spin on the views expressed. It is also curious that the focus group participants were not polled on whether ministers or civil servants at the Department of Health should have access to their records. It is quite predictable from studies such as those of Hassey and Wells [1] [2] that the answer would have been no. The report text at page 15 does, however, mention that `a general rule evolved that any information released outside of NHS treatment areas should be anonymised, or patient permission sought'. The presentation of the report nevertheless directs this mistrust at receptionists rather than tackling the issue of whether patients trust the Secretary of State with their medical records.
  11. The design of the `Confidentiality Consultation Questionnaire' also shows the obsession with spin rather than with tackling the underlying problems. There are extensive questions about whether the report is too long, and whether the language is easy enough. The hard question, of whether the policy is right, is presented as a question on `approach' rather than principle, and much smaller reply boxes are offered for comment. The overall impression is of a department quite settled on its approach and interested only in presentational improvements.
  12. The consultation materials - including the video script and the `Model for the Future' - are also misleading in that they imply that anonymisation actually works. This is not the case in the majority of UK medical applications, as even de-identified records contain so many different facts about the patients to whom they refer that re-identification is easy [4]. In most cases, supposedly de-identified records contain postcode and date of birth, a combination adequate to identify 98% of the population; an increasing number of applications include the NHS number as well, and some (HIV, PHLS) even include a Soundex code of the patient's surname.
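The point can be made concrete with a toy linkage attack. The sketch below, with invented data, joins a `de-identified' extract to any register that pairs names with postcode and date of birth - an edited electoral roll, say, or a marketing database; with real data this pair of fields alone is enough to single out almost everyone [4].

```python
# Toy illustration of re-identification by linkage: the "de-identified"
# extract still carries postcode and date of birth. All data are invented.
deidentified_episodes = [
    {"postcode": "CB3 0FD", "dob": "1957-06-14", "diagnosis_code": "F32.1"},
    {"postcode": "LS2 9JT", "dob": "1981-11-02", "diagnosis_code": "Z30.0"},
]

population_register = [   # any list that pairs names with postcode and DOB
    {"name": "A. Patient",   "postcode": "CB3 0FD", "dob": "1957-06-14"},
    {"name": "B. Bystander", "postcode": "LS2 9JT", "dob": "1981-11-02"},
]

# Build an index on the quasi-identifiers and look each episode up in it.
index = {(p["postcode"], p["dob"]): p["name"] for p in population_register}
for episode in deidentified_episodes:
    name = index.get((episode["postcode"], episode["dob"]))
    if name:   # with postcode plus full date of birth this almost always succeeds
        print(f"{name} -> {episode['diagnosis_code']}")
```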
  13. Stripped of the spin, the results from the focus groups do not support the proposed policy but rather its critics. A longstanding criticism of NHS information strategy has been its focus on data flows from clinicians to administrators, rather than between clinicians or between clinicians and patients. One finds, for example, the comment that it would be nice if patients could email their doctor (p 16). It would indeed. In fact, if the NHS were patient-centred as ministers claim it to be, then email would have been introduced first to enable patients to communicate with their doctors, and then to enable doctors to communicate with each other, and finally to enable doctors to communicate with the Department of Health. That matters have been engineered the other way round is telling.
  14. The report holds out the prospect that patients will eventually be able to have their most private data sealed in an envelope that will remain under their personal control. The easy point to raise here is that such envelopes are unlikely in practice to be developed, given the distributed nature of NHS providers, the great heterogeneity of systems and the fact that the business case for the Integrated Care Record has been rejected. The report is therefore misleading by promising what is unlikely to be delivered.
  15. The hard question is whether patients will be able to forbid carers from sharing any identifiable information at all with the Clearing service. In one case known to us, a hospital patient who did this had his information shared anyway, presumably because the hospital would not otherwise have been paid. It is easy to make promises about security, but for these to be of value it is necessary to build the right incentives into the system. Most security failures occur not because of inadequate technical measures but because of inappropriate incentives, and the current regime, under which carers who do not feed Whitehall with data do not get paid, is likely to undermine any technical measures to improve things. In any case, FIPR recommends that patients be entitled to forbid carers from passing personal health information to administrative systems such as Clearing, and that adequate administrative arrangements be made for hospitals to be paid while scrupulously respecting patients' instructions.
  16. A related question is whether patients who want sensitive treatment will be able to receive it under a false name, since it appears that any name offered will be reported to the Department of Health. It has been common in the past for schoolgirls seeking contraception to give false names, and the computerisation of item-of-service payments resulted in doctors not being paid for consultations where the name given was not on the relevant register. This caused considerable anger, as doctors do not like being forced into the role of policemen. We are therefore particularly concerned at the recent consultation on `entitlement cards', which implies that medical treatment will no longer be available on the NHS at all except to people who provide strong evidence of identity via government-issued photo-ID. We recommend that the NHS distance itself publicly and forcefully from this bizarre Home Office view, which if implemented would make the problems already experienced with contraception pervasive throughout the NHS, with potentially severe effects on morale (to say nothing of the implications for ethics and for the management of infectious disease). The NHS should instead make it easy for patients to be treated under false names if they wish.
  17. The draft Sharing Charter calls on patients to `Accept that the NHS may, in extreme circumstances, have to refuse to treat them, if their decision to restrict sharing of information makes them untreatable or makes treatment dangerous'. This is a threat; it may be illegal, and is certainly unethical. At present records are often inaccessible, because patients are away from home, or records have been lost by hospitals. Patients are frequently treated without records, relying on information provided verbally (which may sometimes be more accurate). Any patient who declines to share appropriate clinical information with clinicians is accepting some risk - but that is their decision and no ground for refusing treatment. The right to silence extends to patients as well as to criminal defendants. The threat here is particularly unacceptable given the low priority that the NHS has given to record availability and to communications between clinicians, compared to the effort invested in central data collection.
  18. The draft Sharing Charter also says that the NHS will share identifiable information if people have a need to know it. This is completely wrong. The legal and ethical basis of information sharing is not `need to know' but consent. `Need to know' is something decided by administrators while consent is something given by the patient. If a patient has heart disease and prefers to keep this private, while a cardiology professor insists he has a `need to know' so that he can claim his research data is unbiased, then under the `need to know' doctrine the cardiologist will prevail. But under the law, and under medical ethics, it is the patient's prohibition that must prevail. In information security terms, `need to know' is also associated with hierarchical trust models where information flows up toward the top, which is what the NHS may have been working towards but is the opposite of what patients want and medical ethics demand. The phrase `need to know' has no place in this document or in any other document expressing NHS confidentiality policy.
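The difference between the two doctrines can be stated in a few lines. In the sketch below - our own illustration, not a description of any NHS system - the consent rule asks only whom the patient has agreed may see the record, while the `need to know' rule turns on a claim asserted by someone other than the patient.

```python
# Illustrative contrast between consent-based and "need to know" access rules.
# Function names, roles and the example are ours, not an NHS specification.

def consent_based_access(requester, patient_consent):
    """Grant access only to those the patient has agreed may see the record."""
    return requester in patient_consent

def need_to_know_access(requester, asserted_need):
    """Grant access on a claim of need, decided by someone other than the patient."""
    return asserted_need

# The cardiology professor of the example above:
consent = {"gp", "practice_nurse"}    # the patient has not consented to research use
print(consent_based_access("research_professor", consent))   # False: the prohibition prevails
print(need_to_know_access("research_professor", True))       # True: the patient is overridden
```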
  19. The view in the Charter that the NHS will `only give permission for others (insurers, mortgage lenders, employers, solicitors, etc.) to see relevant parts of their record unless seeing their full record is absolutely necessary' also shows an attitude on the part of the writers that is completely at odds with ethics and law. Permission is granted not by the NHS but by the patient. This casual assumption of powers that the NHS does not at present possess is extremely worrying.
  20. The Charter is vague about the action to be taken when rules about confidentiality are broken, and about the criteria for deciding that a breach has occurred. FIPR believes that patients should be informed of everyone outside their immediate care team who has access to their records; that the log of who has accessed a record should be kept with the record itself, so that it will be reviewed regularly by the patient's GP and will be seen by the patient whenever a subject access request is made; and that all breaches should be reported to the patient. The impending abolition of CHCs means that there will be no independent body that can be trusted to raise the alarm rather than sweep the problem under the carpet to minimise embarrassment to managers. Without such a provision, it can be expected that breaches will simply be reported to Caldicott guardians, who will be sorely tempted to conclude that no harm appears to have been done and so no further action is required.
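The audit arrangement we recommend can be sketched very simply: the access log lives inside the record itself, so it travels with the record, is seen by the reviewing GP, and comes out automatically on a subject access request. The classes and fields below are our own illustration of that principle, not a design for any particular NHS system.

```python
# Sketch of an access log kept with the record itself, so that it is reviewed
# by the GP and seen by the patient on any subject access request.
# Class and field names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccessEntry:
    who: str      # person or organisation that read the record
    role: str     # e.g. "practice nurse", "Clearing", "researcher"
    reason: str
    when: datetime

@dataclass
class PatientRecord:
    nhs_number: str
    clinical_notes: list = field(default_factory=list)
    access_log: list = field(default_factory=list)   # inseparable from the record

    def read(self, who, role, reason):
        """Every read is logged in the record itself before anything is returned."""
        self.access_log.append(AccessEntry(who, role, reason,
                                           datetime.now(timezone.utc)))
        return self.clinical_notes

    def subject_access_report(self):
        """What the patient or reviewing GP sees: everyone who has looked, and why."""
        return [f"{e.when:%Y-%m-%d} {e.who} ({e.role}): {e.reason}"
                for e in self.access_log]
```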
  21. There is also no mention in the Charter of the intention to share the Integrated Care Record with bodies outside the NHS. In view of the aversion to sharing outside the NHS that has been repeatedly expressed by patients, from the Hassey-Wells study to the focus groups reported in `Share with Care', it should be explicit policy that such sharing will only be permitted with the informed and uncoerced consent of the patient or by order of a competent court.
  22. The `Model for the Future' says that staff have an obligation to preserve confidentiality, and that they are bound by a clear and detailed code of practice. This is not reassuring. It is currently both easy and routine for private investigators to obtain personal health information by `social engineering', a fancy phrase for telling lies on the telephone. The BMA published guidelines in 1996 that would be sufficient to end this practice [5] - that when someone calls a provider or health authority asking for personal information about a patient, the request should be logged, a decision should be made by a clinician, and the caller authenticated by calling back (to a number obtained from a suitable directory rather than to a number volunteered by the caller). A pilot of this proposal at one health authority exposed some thirty false-pretext calls per week; if that rate is at all typical, it translates across the NHS into hundreds of thousands of confidentiality breaches per annum.
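The call-back procedure is simple enough to express as a short checklist in code. The sketch below is our own rendering of the three steps just described - log the request, let a clinician decide, ring back on an independently obtained number - and not the text of the BMA guidelines themselves.

```python
# Sketch of the call-back procedure for telephone enquiries about patients:
# log every request, let a clinician decide, and authenticate the caller by
# ringing back on a number from an independent directory, never on a number
# the caller volunteered. Our own rendering, not the BMA text.
from typing import Optional


def handle_telephone_enquiry(caller_name: str, claimed_org: str,
                             volunteered_number: str,
                             directory_lookup, clinician_approves,
                             request_log: list) -> Optional[str]:
    """Return the number to ring back on, or None if the request is refused."""
    # 1. Log every request, whether or not it is granted.
    request_log.append((caller_name, claimed_org, volunteered_number))

    # 2. A clinician, not a clerk, decides whether disclosure is appropriate.
    if not clinician_approves(caller_name, claimed_org):
        return None

    # 3. Ring back only on a number obtained from a suitable directory;
    #    an organisation with no directory entry gets nothing.
    return directory_lookup(claimed_org)
```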
  23. Yet when we look at the `Draft Code of Practice' we find that the need to identify enquirers gets a mere five lines (Annex A section 5b). The bulk of the Code is given over to reassuring patients and to legal boilerplate.
  24. NHS managers are surely aware that given the avalanche of directives, targets, executive letters, guidance notes and other paperwork with which medics and others in the NHS are deluged, there is little likelihood that five lines in an annex to a policy document will be widely noticed, let alone cause effective change. The issue here is not understanding the problem, but doing something about it; and the NHS culture encourages managers to bury problems whenever possible. For example, the 1996 pilot of the BMA guidelines was not extended to all health authorities, hospitals and surgeries, but closed down as it was found to be embarrassing. So while FIPR welcomes this first mention in an official policy document of the problem of false-pretext phone calls, we are very concerned that the likely effect of the Code will be to reduce the risk to ministers rather than to reduce the risk to patients. Now that NHS staff have been formally advised (in some sense) of the problem, future confidentiality breaches will be their responsibility rather than that of the Secretary of State. This is most unhelpful.
  25. FIPR therefore recommends that the risks of false-pretext phone calls, and of other `social engineering' means of compromising patient privacy, be given much greater prominence, and that the Code require implementation of the BMA guidelines in full. This should be followed up as necessary with training, penetration testing and publicity. The NHS should also monitor the `street price' of medical records and attempt to raise it from hundreds of pounds to at least thousands.
  26. These five recommendations will make a good start and will go a long way towards restoring the trust that has been lost. They will also create a firmer foundation for future data-intensive medical research projects such as Biobank.
  27. However, if NHS managers just continue with the old policy and think they can fix its consequences by ever more aggressive spin-doctoring, then sooner or later there will be a scandal. Patient privacy arouses strong emotions and is a legal right. Playing fast and loose with it is reckless. NHS administrators should bear in mind that carelessness over laboratory animal welfare in the 1970s led to the current problems with animal rights activists, and that the relaxed attitudes towards taking specimens from patients (living and dead) led to the Alder Hey scandal. The current practice of confidentiality in the NHS is similarly an accident waiting to happen.
  28. Finally, we would like to object to the way the consultation has been managed. Its public visibility has been zero, and a consultation so poorly advertised cannot really be considered to have been a public consultation. Furthermore, the website does not work with Netscape, and the documents were not downloadable when we tried to get them using Microsoft Internet Explorer (eventually we obtained a copy from the Google cache). When we were finally ready to submit our response, we could find no email address to send it to, and got no response from the helpline. The NHSIA's handling of the mechanics of this consultation does not bode well for the safety and privacy of electronic medical records in the UK.
  29. Bibliography

    1. Hassey GA, Wells M. Clinical Systems Security - Implementing the BMA Policy & Guidelines. In: Anderson R (ed), Personal Medical Information - Security, Engineering and Ethics. Springer, 1997, pp 79-94. ISBN 3-540-63244-1.
    2. Wells M, Hassey GA, Wilson A, Pearson D. Diabetic Registers - A Practice Survey of Patients' Attitudes. Health Informatics Journal, vol 4 (3 & 4), pp 216-222. Sheffield Academic Press, December 1998.
    3. Singleton P. ERDIP Evaluation Project N5 - Patient Consent and Confidentiality Study Report. NHS Information Authority, May 2002.
    4. Anderson RJ. Security Engineering - A Guide to Building Dependable Distributed Systems. Wiley, 2001.
    5. Anderson RJ. Clinical System Security - Interim Guidelines. British Medical Journal, vol 312, no 7023, pp 109-111, 13 January 1996.