University of Cambridge Computer Laboratory

Ross Anderson

[Research] [Blog] [Politics] [My Book] [Music] [Contact Details]

What's New

We have scholarships for PhD students from 2015. Please email me if you're interested!

Privacy versus government surveillance – where network effects meet public choice examines the economics of surveillance: the Snowden papers show that information economics applies to the NSA and GCHQ, just as it applies to Google and Microsoft (blog; Schneier)

Chip and Skim: Cloning EMV Cards with the Pre-Play Attack describes attacks based on design and implementation flaws in EMV that are actually being exploited in Spain, Poland and elsewhere (blog)

Security protocols and evidence: where many payment systems fail analyses why dispute resolution is hard. In a nutshell, the systems needed to support it properly just don't get built (blog).

In Why bouncing droplets are a pretty good model of quantum mechanics, we solve an outstanding mystery in physics (see blog posts, three previous papers and older blog posts).

Reading this may harm your computer – The psychology of malware warnings analyses what sort of text we should put in a warning if we actually want the user to pay attention to it (blog).

2013 highlights included Rendezvous, a prototype search engine for code; a demonstration that we could steal your PIN via your phone camera and microphone; an analysis of SDN Authentication; and papers on quantum computing and Bell's inequality.

2012 highlights included a big report on Measuring the Cost of Cybercrime and a history of security economics; an attempt to kill the government's smart metering project; three papers on dynamic networks; and four papers on payment protocols: Chip and Skim: cloning EMV cards with the pre-play attack, How Certification Systems Fail, A birthday present every eleven wallets? and Social Authentication – harder than it looks. Finally, Risk and privacy implications of consumer payment innovation discusses both payment and economic issues.

2011 highlights included a major report on the Resilience of the Internet Interconnection Ecosystem which studies how an attacker might bring down the Internet; an updated survey paper on Economics and Internet Security which covers recent analytical, empirical and behavioral research; and Can We Fix the Security Economics of Federated Authentication? which explores how we can deal with a world in which your mobile phone contains your credit cards, your driving license and even your car key. What happens when it gets stolen or infected? (blog)

2010 highlights included a paper on why Chip and PIN is broken for which we got coverage on Newsnight and a best paper award (later, the banks tried to suppress this research). Other bank security work included a paper on Verified by VISA and another on the unwisdom of banks adopting proprietary standards. On the control systems front, we published papers on the technical security and security economics of smart meters, on their privacy, on their deployment and on key management for substations. I created a psychology and security web page and wrote a paper on putting context and emotion back in security decisions.

2009 highlights included Database State, an influential report we wrote about the failings of public-sector IT in Britain (a number of its recommendations have been adopted by the new government); The snooping dragon which explains how the Chinese spooks hacked the Dalai Lama in the run-up to the Peking Olympics; Eight Friends are Enough, which shows how little privacy you have on Facebook; and The Economics of Online Crime. There are also videos of talks I gave on dependability at the IET, Krakow and De Montfort, as well as a survey paper, the slides, and a podcast. Finally, I wrote an Unauthorised History of Cambridge University.

2008 highlights included a major study of Security Economics and European Policy for the European Commission; the second edition of my book "Security Engineering"; the discovery of serious vulnerabilities in Chip and PIN payment systems; an analysis of the failings of the Financial Ombudsman Service (see also a video from the World Economic Forum in November 2008); the FIPR submission to the Thomas-Walport Review; a piece on confidentiality in the British Journal of General Practice; three videos on privacy made by ARCH; and a video on surveillance. I started a Workshop on Security and Human Behaviour to bring together psychologists with economists and security engineers to work on deception and risk.

2007 highlights included technical papers on RFID and on New Strategies for Revocation in Ad-Hoc Networks (which explores when suicide attacks are effective); a Google tech talk on searching for covert communities online; a paper on fraud, risk and nonbank payment systems I wrote for the Fed; and a survey paper on Information Security Economics (of which a shortened version appeared in Science). I was a special adviser to the House of Commons Health Committee for their Report on the Electronic Patient Record. Finally, following the HMRC data loss, I appeared in the debate on Newsnight.

2006 highlights included technical papers on topics from protecting power-line communications to the Man-in-the-Middle Defence, as well as a major report on the safety and privacy of children's databases for the UK Information Commissioner, which got a lot of publicity. I ended the year by debating health privacy on the Today programme with health minister Lord Warner, who resigned shortly afterwards.

2005 highlights included research papers on The topology of covert conflict, on combining cryptography with biometrics, on Sybil-resistant DHT routing, and on Robbing the bank with a theorem prover; and a big survey paper on cryptographic processors.

2004 highlights included papers on cipher composition, key establishment in ad-hoc networks and the economics of censorship resistance. I also lobbied for amendments to the EU IP Enforcement Directive and organised a workshop on copyright which led to a common position adopted by many European NGOs.


Research

I am Professor of Security Engineering at the Computer Laboratory. My research students are Sheharbano Khattak, Kumar Sharad, Laurent Simon, Rubin Xu and Dongting Yu. Richard Clayton, Sergei Skorobogatov, David Modic, Sophie van der Zee and Alice Hutchings are postdocs. I'm also collaborating with Robert Brady. Alumni include former postdocs Mike Bond, Vashek Matyas, Steven Murdoch and Andrei Serjantov, while Jong-Hyeon Lee, Frank Stajano, Fabien Petitcolas, Harry Manifavas, Markus Kuhn, Ulrich Lang, Jeff Yan, Susan Pancho, Mike Bond, George Danezis, Sergei Skorobogatov, Hyun-Jin Choi, Richard Clayton, Jolyon Clulow, Hao Feng, Andy Ozment, Tyler Moore, Shishir Nagaraja, Robert Watson, Hyoungshick Kim, Shailendra Fuloria, Joe Bonneau and Wei-Ming Khoo have earned PhDs.

My research topics include:

By default, when I post a paper here I license it under the relevant Creative Commons license, so you may redistribute it with attribution but not modify it. I may subsequently assign the residual copyright to an academic publisher.

Economics and Psychology of information security

As systems scale globally, incentives start to matter as much as technology. Systems break when the people who could fix them are not the people who suffer the costs of failure. So it's not enough for security engineers to understand cryptomathematics and the theory of operating systems; we have to understand game theory and microeconomics too. This has led to a rapidly growing interest in ‘security economics’, a discipline I helped to found. This discipline is starting to embrace dependability and software economics; at the other end, it's growing through behavioural economics into the psychology of security. I maintain the Economics and Security Resource Page and a similar web page on Security Psychology. There is also a web page on the economics of privacy, maintained by Alessandro Acquisti. My research contributions include the following.

There are two annual workshops I helped establish. On the psychology side, the Security and Human Behaviour workshop is great fun and hugely productive. On the economic side, the Workshop on Economics and Information Security is now into its thirteenth year and attracts over a hundred participants.
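The incentive argument above, that systems break when the people who could fix them are not the people who bear the cost of failure, is often illustrated with a toy "weakest-link" security investment game. The sketch below is a generic textbook illustration, not code from any of the papers mentioned here, and the payoff numbers are arbitrary:

```python
from itertools import product

# Toy "weakest-link" security game (hypothetical numbers, for illustration).
# Each of two firms chooses whether to invest in security (cost 1).
# The shared system is secure only if BOTH invest; security is worth 3 to each.
BENEFIT, COST = 3, 1

def payoff(own: int, other: int) -> int:
    """Payoff to a firm playing `own` (1 = invest) against `other`."""
    secure = own and other          # weakest link: the minimum of the two efforts
    return BENEFIT * secure - COST * own

def nash_equilibria():
    """Strategy profiles where neither firm gains by unilaterally switching."""
    eq = []
    for a, b in product((0, 1), repeat=2):
        if payoff(a, b) >= payoff(1 - a, b) and payoff(b, a) >= payoff(1 - b, a):
            eq.append((a, b))
    return eq

print(nash_equilibria())  # [(0, 0), (1, 1)]
```

With these payoffs, mutual under-investment (0, 0) is just as stable as mutual investment (1, 1): no firm gains by securing itself alone, even though everyone is better off if all invest. That free-rider trap is the kind of incentive failure security economics studies.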


Peer-to-Peer and social network systems

Since about 2000, there has been an explosion of interest in peer-to-peer and ad-hoc networking. One of the seminal papers was The Eternity Service, which I presented at Pragocrypt 96. I had been alarmed by the Scientologists' success at closing down the penet remailer in Finland, and have more than once been threatened by lawyers who did not want me to comment on the security of their clients' systems. Yet the modern era only started once the printing press enabled seditious thoughts to be spread too quickly and widely to ban. But when books no longer exist as tens of thousands of paper copies, but as a file on a single server, will government ministers and judges be able to unpublish them once more? (This has since happened to newspaper archives in Britain.) So I invented the Eternity Service as a means of putting electronic documents beyond the censor's grasp. The Eternity Service inspired second-generation censorship-resistant systems such as Publius and Freenet; one descendant of these early systems is wikileaks. Our main contribution nowadays lies in helping to maintain Tor, the anonymity service used by wikileaks and by many others.

But the biggest deal turned out to be not sedition, or even pornography, but copyright. Hollywood's action against Napster led to our ideas being adopted in filesharing systems. Many of these developments were described here, and discussed at conferences like this one. See also Richard Stallman's classic, The Right to Read. Many of the ideas in early peer-to-peer systems reemerged in the study of ad-hoc and sensor networks and are now spilling over into social networking systems.

My contributions since the Eternity paper include the following.


Reliability of security systems

I have been interested for many years in how security systems fail in real life. This is a prerequisite for building robust secure systems; many security designs are poor because they are based on unrealistic threat models. This work began with a study of automatic teller machine fraud, and expanded to other applications as well. It provides the central theme of my book. I also have a separate page on bank security which gathers together all our papers on fraud in payment systems with some additional material.

The papers on physical security by Roger Johnston's team are also definitely worth a look, and there's an old leaked copy of the NSA Security Manual that you can download (also as latex).


Robustness of cryptographic protocols

Many security system failures are due to poorly designed protocols, and this has been a Cambridge interest for many years. Some relevant papers follow.

Protocols have been the stuff of high drama. Citibank asked the High Court to gag the disclosure of certain crypto API vulnerabilities that affect a number of systems used in banking. I wrote to the judge opposing this; a gagging order was still imposed, although in slightly less severe terms than Citibank had requested. The trial was in camera, the banks' witnesses didn't have to answer questions about vulnerabilities, and new information revealed about these vulnerabilities in the course of the trial may not be disclosed in England or Wales. Information already in the public domain was unaffected. The vulnerabilities were discovered by Mike Bond and me while acting as the defence experts in a phantom withdrawal court case, and independently discovered by the other side's expert, Jolyon Clulow, who later joined us as a research student. They are of significant scientific interest, as well as being relevant to the rights of the growing number of people who suffer phantom withdrawals from their bank accounts worldwide. Undermining the fairness of trials and forbidding discussion of vulnerabilities isn't the way forward (press coverage by the Register and news.com).


Analysis and design of cryptographic algorithms

The attack on the hash function SHA made Tiger, which Eli Biham and I designed in 1995, a popular choice of hash function for a while. I also worked with Eli, and with Lars Knudsen, to develop Serpent – a candidate block cipher for the Advanced Encryption Standard. Serpent won through to the final of the competition and got the second largest number of votes. Another of my contributions was founding the series of workshops on Fast Software Encryption.

Other papers on cryptography and cryptanalysis include the following.


Information hiding (including Soft Tempest)

From the mid- to late-1990s, I did a lot of work on information hiding.


Security of Clinical Information Systems

There's a huge row brewing over the government's plans to centralise medical records; the cover story is improving care and giving us access to our records online while the real agenda is to sell our medical data to insurance companies and drug company researchers. This follows a big row under the last Government over the Summary Care Record, which centralises records and makes them available to hundreds of thousands of NHS staff. To see the history via our most recent blog posts, go here.

The NHS has a long history of privacy abuses. The previous prime minister's own medical records were compromised; the miscreant got off scot-free as it was not in the "public interest" to prosecute him. In another famous case, Helen Wilkinson had to organise a debate in Parliament to get ministers to agree to remove defamatory and untrue information about her from NHS computers. The minister assured the House that the libels had been removed; months later, they still had not been. Helen started www.TheBigOptOut.org to campaign for health privacy. They have been joined by medConfidential, Big Brother Watch and others.

Here are my most recent papers on the subject.

Civil servants started pushing for online access to everyone's records in 1992 and I got involved in 1995, when I started consulting for the British Medical Association on the safety and privacy of clinical information systems. Back then, the police were given access to all drug prescriptions, after the government argued that they needed it to catch doctors who misprescribed heroin. The police got their data, but they didn't catch Harold Shipman, and no-one was held accountable. The NHS slogan in 1995 was 'a unified electronic patient record, accessible to all in the NHS'. The BMA campaigned against this, arguing that it would destroy patient privacy.

In 1996, the Government set up the Caldicott Committee to study the matter. Their report made clear that the NHS was already breaking confidentiality law by sharing data without consent; but the next Government just legislated (and regulated, and again) to give itself the power to share health data as the Secretary of State saw fit. (We objected and pointed out the problems the bill could cause; similar sentiments were expressed in a BMJ editorial, and a Nuffield Trust impact analysis, and BMJ letters here and here. Ministers claimed the records were needed for cancer registries: yet cancer researchers work with anonymised data in other countries – see papers here and here.) There was a storm of protest in the press: see the Observer, the New Statesman, and The Register. But that died down; the measure has now been consolidated as sections 251 and 252 of the NHS Act 2006, the Thomas-Walport review blessed nonconsensual access to health records (despite FIPR pointing out that this was illegal – a view later supported by the European Court). A government committee, the NHS Information Governance Board, was set up to oversee this lawbreaking, and Dame Fiona is being wheeled out once more. Centralised, nonconsensual health records not only contravene the I v Finland judgement but the Declaration of Helsinki on ethical principles for medical research and the Council of Europe recommendation no R(97)5 on the protection of medical data.

Two health IT papers by colleagues deserve special mention. Privacy in clinical information systems in secondary care describes a hospital system implementing something close to the BMA security policy (it is described in more detail in a special issue of the Health Informatics Journal, v 4 nos 3-4, Dec 1998, which I edited). Second, Protecting Doctors' Identity in Drug Prescription Analysis describes a system designed to de-identify prescription data for commercial use; although de-identification usually does not protect patient privacy very well, there are exceptions, such as here. This system led to a court case, in which the government tried to stop its owner promoting it – as it would have competed with their (less privacy-friendly) offerings. The government lost: the Court of Appeal decided that personal health information can be used for research without patient consent, so long as the de-identification is done competently.
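A common building block in systems like the prescription de-identifier described above is replacing direct identifiers with keyed pseudonyms, so that records about the same patient can still be linked without revealing who they are. The sketch below is a generic illustration with assumed field names; it is not the design of the actual system, and the hard-coded key shown is far too naive for production use:

```python
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-properly-managed-key"  # illustrative only

def pseudonym(identifier: str) -> str:
    """Stable keyed pseudonym: the same input always maps to the same token,
    but the mapping cannot be reversed without the key."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def de_identify(record: dict) -> dict:
    """Drop direct identifiers, keeping a pseudonym for record linkage."""
    out = {k: v for k, v in record.items() if k not in ("name", "nhs_number")}
    out["patient_id"] = pseudonym(record["nhs_number"])
    return out

rx = {"name": "A Patient", "nhs_number": "943 476 5919",
      "drug": "atorvastatin", "date": "2014-01-07"}
print(de_identify(rx))
```

As the text notes, de-identification of this kind rarely protects patients well on its own: the remaining quasi-identifiers (drug, date, location) can often re-identify a record when combined with other data.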

Resources on what's happening in the USA – where the stimulus bill has made medical privacy a very live issue – include many NGOs: Patient Privacy Rights may have been the most influential, but see also EPIC, the Privacy Rights Clearinghouse, the Citizens' Council on Health Care, the Institute for Health Freedom, and CDT. Older resources include an NAS report entitled For the Record: Protecting Electronic Health Information, a report by the Office of Technology Assessment, a survey of the uses of de-identified records for the DHHS, and a GAO report on their use in Medicare. For information on what's happening in the German-speaking world, see Gerrit Bleumer's web page. As for the basic science, the American Statistical Association has a good collection of links to papers on inference control, also known as statistical security – the protection of de-identified data.
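One of the simplest techniques in that inference-control literature is small-cell suppression: counts below a threshold are withheld from published tables, because a very small cell can single out an individual. A minimal sketch (the threshold, field names and data are illustrative, not from any of the reports above):

```python
from collections import Counter

def safe_table(records, attr, threshold=5):
    """Frequency table with small-cell suppression: counts below
    `threshold` could identify individuals, so they are withheld."""
    counts = Counter(r[attr] for r in records)
    return {k: (v if v >= threshold else "<suppressed>")
            for k, v in counts.items()}

records = [{"diagnosis": "flu"}] * 12 + [{"diagnosis": "rare-condition"}] * 2
print(safe_table(records, "diagnosis"))
# the flu count of 12 is published; the rare condition's count of 2 is withheld
```

Real statistical disclosure control goes much further (complementary suppression, rounding, noise addition), since a suppressed cell can sometimes be recovered from row and column totals.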


Public policy issues

I chair the Foundation for Information Policy Research, the UK's leading Internet policy think tank, which I helped set up in 1998. We are not a lobby group; our enemy is ignorance rather than the government of the day, and our mission is to understand IT policy issues and explain them to policy makers and the press. Here's an overview of the issues as we saw them in 1999, and a video of how we saw them ten years later in 2008. Some highlights of our work follow.

My pro-bono work includes sitting on Cambridge's Research Ethics Committee and our Board of Scrutiny. From 2003 to 2010 I sat on Council, our governing body. I stood for election because of a proposal that most of the intellectual property generated by faculty members – from patents on bright ideas to books written up from lecture notes – would belong to the university rather than to their creators. There were already unpleasant incidents around IP (see for example Mike Clark's story here); to stop such things happening again, we founded the Campaign for Cambridge Freedoms. The final vote approved a policy according to which academics keep copyright but the University gets 15% of patent royalties. I got re-elected in 2006, and in my second term we won an important vote to protect academic freedom. For more, see my article from the Oxford Magazine, and my Unauthorised History of Cambridge University.

My CV is here. I'm a Fellow of Churchill College, the Royal Society, the Royal Academy of Engineering, the Institution of Engineering and Technology, the Institute of Mathematics and its Applications, and the Institute of Physics. My h-index is tracked here. As for my academic genealogy, my thesis adviser was Roger Needham; his was Maurice Wilkes; then it runs back through Jack Ratcliffe, Edward Appleton, Ernest Rutherford, JJ Thomson, Lord Rayleigh, Edward Routh, William Hopkins, Adam Sedgwick, Thomas Jones, Thomas Postlethwaite, Stephen Whisson, Walter Taylor, Roger Smith, Roger Cotes, Isaac Newton, Isaac Barrow and Vincenzo Viviani to Galileo Galilei.

Finally, here is my PGP key. If I revoke this key, I will always be willing to explain why I have done so provided that the giving of such an explanation is lawful. (For more, see FIPR.)


My Book on Security Engineering


The second edition is now out! You can order it from Amazon.com and Amazon.co.uk.

Security engineering is about building systems to remain dependable in the face of malice, error or mischance. As a discipline, it focuses on the tools, processes and methods needed to design, implement and test complete systems, and to adapt existing systems as their environment evolves. My book has become the standard textbook and reference since it was published in 2001. You can download both the first and second editions without charge here.

Security engineering is not just concerned with infrastructure matters such as firewalls and PKI. It's also about specific applications, such as banking and medical record-keeping, and about embedded systems such as automatic teller machines and burglar alarms. It's usually done badly: it often takes several attempts to get a design right. It is also hard to learn: although there were good books on a number of the component technologies, such as cryptography and operating systems, there was little about how to use them effectively, and even less about how to make them work together. Most systems don't fail because the mechanisms are weak, but because they're used wrong.

My book was an attempt to help the working engineer to do better. As well as the basic science, it contains details of many applications – and a lot of case histories of how their protection failed. It describes a number of technologies which aren't well covered elsewhere. The first edition was pivotal in founding the now-flourishing field of information security economics: I realised that the narrative had to do with incentives and organisation at least as often as with the technology. The second edition incorporates the economic perspectives we've developed since then, and new perspectives from the psychology of security, as well as updating the technological side of things.

More ...


Contact details

University of Cambridge Computer Laboratory
JJ Thomson Avenue
Cambridge CB3 0FD, England

E-mail: Ross.Anderson@cl.cam.ac.uk
Tel: +44 1223 33 47 33
Fax: +44 1223 33 46 78

I only referee for open publications, so I discard emails asking for reports for journals that sit behind a paywall. I also usually discard emails sent by people's secretaries: if you can't be bothered to email me yourself, then I can't be bothered to answer myself either.

If you want to do a PhD, please read our web pages first; we get lots of CV spam which we delete. I also discard emails that ask for internships; we can't employ overseas students on Tier 4 visas any more. If you're interested in coming as an undergrad, have a look at our video.