University of Cambridge Computer Laboratory

Ross Anderson

[Research] [Blog] [Videos] [Politics] [My Book] [Music] [Seminars] [Contact Details]

Snitches Get Stitches: On The Difficulty of Whistleblowing presents a game-theoretic model of whistleblowing, which is often more complex than it seems. It won an award at the workshop where it was the last paper presented.

Sponge Examples: Energy-Latency Attacks on Neural Networks describes how to find inputs to neural networks that make them take a lot of time, or burn a lot of energy. They can be used to distract or jam machine learning systems in a wide range of applications (blog, press, Schneier). This follows The Taboo Trap, a mechanism to block adversarial machine learning attacks on energy-constrained devices (invited talk at AISEC 2019, older version of paper), and work on neural network compression (SysML talk). See also papers on transferability and adversarial reinforcement learning (accepted at DSN-DSML'20).
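The core idea behind sponge examples can be illustrated with a toy sketch. This is not the paper's actual method (which uses genetic algorithms and gradient-based search against real models and hardware); the tiny ReLU network, the energy proxy, and the hill-climbing search below are all simplifying assumptions, chosen only to show how one can search for inputs that maximise an energy or latency cost:

```python
import random

random.seed(0)

D = 8  # toy input/hidden width (assumption; real models are far larger)

# Fixed random weights for a two-layer ReLU network, standing in for a
# real victim model.
W1 = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(D)]
W2 = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(D)]

def relu_layer(x, W):
    return [max(0.0, sum(w * v for w, v in zip(row, x))) for row in W]

def energy(x):
    """Energy proxy: count of nonzero activations. Sparsity-aware
    hardware skips zeros, so denser activations cost more time/power."""
    h1 = relu_layer(x, W1)
    h2 = relu_layer(h1, W2)
    return sum(1 for v in h1 + h2 if v > 0)

def sponge_search(steps=500, sigma=0.3):
    """Gradient-free hill climbing toward an energy-maximising input."""
    x = [random.uniform(-1, 1) for _ in range(D)]
    e_start = energy(x)
    best, e_best = x, e_start
    for _ in range(steps):
        # Perturb the current best input, clamped to the valid range.
        cand = [min(1.0, max(-1.0, v + random.gauss(0, sigma))) for v in best]
        e = energy(cand)
        if e >= e_best:
            best, e_best = cand, e
    return best, e_best, e_start

best, e_best, e_start = sponge_search()
print(f"energy proxy rose from {e_start} to {e_best}")
```

The same search loop works with any black-box cost signal, e.g. wall-clock inference time measured on the target device, which is closer to what the paper actually optimises.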

The Lecture Videos and Notes for my first-year undergraduate course on Software and Security Engineering are now online. I've made them available to everyone rather than restricting them to Cambridge students. Enjoy!

I'm writing a third edition of my Security Engineering textbook and putting draft chapters online for comment as I write them.

Measuring the Changing Cost of Cybercrime analyses what's changed in the world of cybercrime since we did the first systematic study seven years ago. Some crimes are up (cryptocrime) and others down (IP infringement), but the overall pattern's surprisingly stable despite huge changes in technology (blog).

timeline ...


I am Professor of Security Engineering at the Computer Laboratory. My research students are Alex Vetterl, Ilia Shumailov and Mansoor Ahmed; my postdocs are Maria Bada, Yi Ting Chua, Richard Clayton, Ben Collier, Franck Courbon, Ildiko Pete and Sergei Skorobogatov; I'm also collaborating with Robert Brady. Anh Vu is a research assistant. My former postdocs are Daniel Thomas, Alice Hutchings, Sergio Pastrana, David Modic, Sven Übelacker, Julia Powles, Ramsey Faragher, Sophie van der Zee, Mike Bond, Vashek Matyas, Steven Murdoch and Andrei Serjantov, while my former students Jong-Hyeon Lee, Frank Stajano, Fabien Petitcolas, Harry Manifavas, Markus Kuhn, Ulrich Lang, Jeff Yan, Susan Pancho-Festin, Mike Bond, George Danezis, Sergei Skorobogatov, Hyun-Jin Choi, Richard Clayton, Jolyon Clulow, Hao Feng, Andy Ozment, Tyler Moore, Shishir Nagaraja, Robert Watson, Hyoungshick Kim, Shailendra Fuloria, Joe Bonneau, Wei-Ming Khoo, Rubin Xu, Laurent Simon, Kumar Sharad, Shehar Bano, Dongting Yu and Khaled Baqer have earned PhDs.

I taught two courses in 2019-20: the undergraduate course in Software and Security Engineering and the graduate course in Computer Security: Principles and Foundations. I also organise our security seminars.

My research topics include:

Sustainability of security

Our computers and communications use several percent of global energy, and have secondary costs too – particularly if you have to throw things away for lack of software updates. I also have a long-standing interest in energy management and have more recently been looking at the energy wasted by cryptocurrency mining and the prevention of wildlife crime. (Incidentally, this website is entirely static – no ads, trackers, javascript or even cookies. The estimated carbon cost per page view is 0.07g compared with over 2g for a typical commercial web page.)

Economics, psychology and criminology of information security

Incentives matter as much as technology for the security of large-scale systems. Systems break when the people who could fix them are not the people who suffer the costs of failure. So it's not enough for security engineers to understand cryptomathematics and the theory of operating systems; we have to understand game theory and microeconomics too. I pioneered the discipline of security economics which is starting to embrace privacy economics, security psychology and criminology too.

I helped establish two relevant workshops: the Security and Human Behaviour workshop, which brings together security engineers and psychologists, and the Workshop on Economics and Information Security, where you can meet everyone working in security economics.

Peer-to-Peer and social network systems

One of the seminal papers in peer-to-peer systems was The Eternity Service, which I invented in response to growing Internet censorship. The modern era only started once the printing press enabled seditious thoughts to be spread too widely to ban. But when books no longer exist as tens of thousands of paper copies, but as a file on a single server, will courts be able to order them unpublished once more? (This has since happened to newspaper archives in Britain.) So I invented the Eternity Service as a means of putting electronic documents beyond the censor's grasp. It inspired second-generation censorship-resistant systems such as Publius and Freenet; one descendant is Wikileaks. But the killer app turned out to be not sedition, or even pornography, but copyright. Hollywood's action against Napster led to our ideas being adopted in filesharing systems; they are now re-emerging in the Internet of Things.

Work since the Eternity paper includes the following.

Reliability of security systems

I have been interested for many years in how security systems fail in real life; many security designs are poor because they are based on unrealistic threat models. I started with a study of ATM fraud, and expanded to other applications one after another. This provides a central theme of my book. I also have a separate page on bank security which gathers together all our papers on fraud in payment systems.

The papers on physical security by Roger Johnston's team are also definitely worth a look, and there's an old leaked copy of the NSA Security Manual that you can download (also as latex).

Robustness of cryptographic protocols

Many security system failures are due to poorly designed protocols, and this has been a Cambridge interest for many years. Some relevant papers follow.

Protocols have been the stuff of high drama. Citibank asked the High Court to gag the disclosure of certain crypto API vulnerabilities that affect a number of systems used in banking. I wrote to the judge opposing this; a gagging order was still imposed, although in slightly less severe terms than Citibank had requested. The trial was in camera, the banks' witnesses didn't have to answer questions about vulnerabilities, and new information revealed about these vulnerabilities in the course of the trial may not be disclosed in England or Wales. Information already in the public domain was unaffected. The vulnerabilities were discovered by Mike Bond and me while acting as the defence experts in a phantom withdrawal court case, and independently discovered by the other side's expert, Jolyon Clulow, who later joined us as a research student. They are of significant scientific interest, as well as being relevant to the rights of the growing number of people who suffer phantom withdrawals from their bank accounts worldwide. Undermining the fairness of trials and forbidding discussion of vulnerabilities isn't the way forward (press coverage in the Register).

Signal processing

Information is often hidden in apparently unrelated signals or even in what appears to be noise, and this can happen either by accident or design. The detection and manipulation of hidden information has many applications, and the field is being refreshed by the recent revolution in neural networks.

Cryptography, including quantum cryptography

Lots of people don't believe quantum crypto is practical. I don't believe the security proofs offered for entanglement-based quantum cryptosystems, because they assume that the strange behaviour observed in the Bell tests must result from nonlocal action. But it can also emerge from pre-existing long-range order. One explanation, advocated by Nobel prizewinner Gerard 't Hooft, is the cellular automaton interpretation of quantum mechanics; see his keynote talk at EMQM 2015. I have been working with Robert Brady to develop another line of inquiry.

In the 1990s I worked with Eli Biham and Lars Knudsen to develop Serpent – a candidate block cipher for the Advanced Encryption Standard. Serpent got the second largest number of votes.

Other papers on cryptography and cryptanalysis include the following.

Another of my contributions was founding the series of workshops on Fast Software Encryption.

Security of Clinical Information Systems

The safety and privacy of clinical systems have been a problem for years. Recent scandals include the Google DeepMind case (exposed by my then postdoc Julia Powles) where the Royal Free Hospital gave Google a million patients' records that they shouldn't have; and the affair where a billion records – basically all hospital care episodes since 1998 – were sold to 1200 firms worldwide, in a format that enabled many patients to be re-identified. It wasn't much better under the previous Labour government, which had a series of rows over thoughtless and wasteful centralisation. There is now an NGO, MedConfidential, which monitors and campaigns for health privacy.

The NHS has a long history of privacy abuses. Gordon Brown's own medical records were compromised while he was prime minister, but the miscreant got off scot-free as it was "not in the public interest" to prosecute him. In another famous case, Helen Wilkinson had to organise a debate in Parliament to get ministers to agree to remove defamatory and untrue information about her from NHS computers. The minister assured the House that the libels had been removed; months later, they still had not been.

Here are my most recent papers on the subject.

Civil servants started pushing for online access to everyone's records in 1992 and I got involved in 1995, when I started consulting for the British Medical Association on the safety and privacy of clinical information systems. Back then, the police were given access to all drug prescriptions, after the government argued that they needed it to catch doctors who misprescribed heroin. The police got their data, but they didn't catch Harold Shipman, and no-one was held accountable. The NHS slogan in 1995 was `a unified electronic patient record, accessible to all in the NHS'. The BMA campaigned against this, arguing that it would destroy patient privacy:

In 1996, the Government set up the Caldicott Committee to study the matter. Their report made clear that the NHS was already breaking confidentiality law by sharing data without consent; but the next Government just legislated (and regulated, and again) to give itself the power to share health data as the Secretary of State saw fit. (We objected and pointed out the problems the bill could cause; similar sentiments were expressed in a BMJ editorial, and a Nuffield Trust impact analysis, and BMJ letters here and here. Ministers claimed the records were needed for cancer registries: yet cancer researchers work with anonymised data in other countries – see papers here and here.) There was a storm of protest in the press: see the Observer, the New Statesman, and The Register. But that died down; the measure has now been consolidated as sections 251 and 252 of the NHS Act 2006, and the Thomas-Walport review blessed nonconsensual access to health records (despite FIPR pointing out that this was illegal – a view later supported by the European Court). A government committee, the NHS Information Governance Board, was set up to oversee this lawbreaking, and Dame Fiona is being wheeled out once more. Centralised, nonconsensual health records contravene not only the I v Finland judgement but also the Declaration of Helsinki on ethical principles for medical research and the Council of Europe recommendation no R(97)5 on the protection of medical data.

Two health IT papers by colleagues deserve special mention. Privacy in clinical information systems in secondary care describes a hospital system implementing something close to the BMA security policy (it is described in more detail in a special issue of the Health Informatics Journal, v 4 nos 3-4, Dec 1998, which I edited). Second, Protecting Doctors' Identity in Drug Prescription Analysis describes a system designed to de-identify prescription data for commercial use; although de-identification usually does not protect patient privacy very well, there are exceptions, such as here. This system led to a court case, in which the government tried to stop its owner promoting it – as it would have competed with their (less privacy-friendly) offerings. The government lost: the Court of Appeal decided that personal health information can be used for research without patient consent, so long as the de-identification is done competently.

Resources on what's happening in the USA include many NGOs: Patient Privacy Rights may have been the most influential, but see also EPIC, the Privacy Rights Clearinghouse, the Citizens' Council on Health Care, the Institute for Health Freedom, and CDT. Older resources include an NAS report entitled For the Record: Protecting Electronic Health Information, a report by the Office of Technology Assessment, a survey of the uses of de-identified records for the DHHS, and a GAO report on their use in Medicare. As for the basic science, see my book chapters on Boundaries and on Inference Control.

Public policy issues

I chair the Foundation for Information Policy Research, the UK's leading Internet policy think tank, which I helped set up in 1998. We are not a lobby group; our enemy is ignorance rather than the government of the day, and our mission is to understand IT policy issues and explain them to policy makers and the press. Here's an overview of the issues as we saw them in 1999, and a video of how we saw them ten years later in 2008. Some highlights of our work follow.

I served on Council, Cambridge University's governing body, 2003–10 and 2015–18. I stood for election because of a proposal that most of the intellectual property generated by faculty members – from patents on bright ideas to books written up from lecture notes – would belong to the university rather than to its creator. To stop this, and to prevent further incidents like this one, we founded the Campaign for Cambridge Freedoms. The final vote approved a policy according to which academics keep copyright but the University gets a share of patent royalties. I got re-elected in 2006, and in my second term we won an important vote to protect academic freedom. For more, see my article from the Oxford Magazine, and my Unauthorised History of Cambridge University. From 2013–14 I was on our Board of Scrutiny. In my third term my main contribution was investigating the delays and cost overruns in a large construction project.

My CV is here while my h-index is tracked here. I'm a Fellow of Churchill College, the Royal Society, the Royal Academy of Engineering, the Institution of Engineering and Technology, the Institute of Mathematics and its Applications, and the Institute of Physics. I won the 2015 Lovelace medal; the interviews I did for that award are here, while my oral history interview transcript is here and an Academy video is here. As for my academic genealogy, my thesis adviser was Roger Needham; his was Maurice Wilkes; then it runs back through Jack Ratcliffe, Edward Appleton, Ernest Rutherford, JJ Thomson, Lord Rayleigh, Edward Routh, William Hopkins, Adam Sedgwick, Thomas Jones, Thomas Postlethwaite, Stephen Whisson, Walter Taylor, Roger Smith, Roger Cotes, Isaac Newton, Isaac Barrow and Vincenzo Viviani to Galileo Galilei.

Finally, here is my PGP key. If I revoke this key, I will always be willing to explain why I have done so provided that the giving of such an explanation is lawful. (For more, see FIPR.)

My Book on Security Engineering


The third edition is under way – you can read draft chapters on my book page.

Security engineering is about building systems to remain dependable in the face of malice, error or mischance. As a discipline, it focuses on the tools, processes and methods needed to design, implement and test complete systems, and to adapt existing systems as their environment evolves. My book has become the standard textbook and reference since it was published in 2001. You can download both the first and second editions without charge here.

Security engineering is not just concerned with infrastructure matters such as firewalls and PKI. It's also about specific applications, such as banking and medical record-keeping, and about embedded systems such as automatic teller machines and burglar alarms. It's usually done badly: it often takes several attempts to get a design right. It is also hard to learn: although there were good books on a number of the component technologies, such as cryptography and operating systems, there was little about how to use them effectively, and even less about how to make them work together. Most systems don't fail because the mechanisms are weak, but because they're used wrong.

My book was an attempt to help the working engineer to do better. As well as the basic science, it contains details of many applications – and a lot of case histories of how their protection failed. It describes a number of technologies which aren't well covered elsewhere. The first edition was pivotal in founding the now-flourishing field of information security economics: I realised that the narrative had to do with incentives and organisation at least as often as with the technology. The second edition incorporates the economic perspectives we've developed since then, and new perspectives from the psychology of security, as well as updating the technological side of things.

More ...

Highlights by year

2018 highlights include papers on what's wrong with bitcoin exchanges and how to trace stolen bitcoin; on making security sustainable; controlling side effects in mainstream C compilers; how protocols evolve and a gullibility metric. There's also an invited talk on privacy for tigers.

2017 highlights include Standardisation and Certification in the Internet of Things, an analysis for the EU of what happens when we get software everywhere, and which informed EU directive 2019/771 on the sale of goods; DigiTally, a prototype payment system we built to extend mobile phone payments to areas of less developed countries with no network coverage, using a novel protocol; and a book chapter I wrote for EPIC.

2016 highlights include a new Android side channel; an investigation of the social externalities of trust; studies of when lying feels the right thing to do, of taking down websites to prevent crime and bank fraud reimbursement; and finally two papers on Brexit.

2015 highlights included Keys Under Doormats, on what's wrong with government attempts to mandate exceptional access to all our data; a Nuffield report on what happens to health privacy in a world of cloud-based medical records and pervasive genomics; a report on the emotional impact of Internet fraud; two papers on how to do lie detection using analysis of body motion; severe flaws in Android factory reset and mobile anti-virus apps; and a novel demonstration that the Bell test results can come from pre-existing long-range order as easily as from nonlocal action.

2014 highlights included papers on Chip and Skim describing pre-play frauds against EMV bank cards; Security protocols and evidence which explains how the systems needed to support proper dispute resolution just don't get built; Experimental Measurement of Attitudes Regarding Cybercrime, on how prosecutors and public opinion are out of step; The psychology of malware warnings, on how to get users to pay attention to risk; Privacy versus government surveillance, on network economics and international relations; and Why bouncing droplets are a pretty good model of quantum mechanics, which solves an outstanding mystery in physics.

2013 highlights included Rendezvous, a prototype search engine for code; a demonstration that we could steal your PIN via your phone camera and microphone; an analysis of SDN Authentication; and papers on quantum computing and Bell's inequality.

2012 highlights included a big report on Measuring the Cost of Cybercrime and a history of security economics; an attempt to kill the government's smart metering project; three papers on dynamic networks; and four papers on payment protocols: Chip and Skim: cloning EMV cards with the pre-play attack, How Certification Systems Fail, A birthday present every eleven wallets? and Social Authentication – harder than it looks. Finally, Risk and privacy implications of consumer payment innovation discusses both payment and economic issues.

2011 highlights included a major report on the Resilience of the Internet Interconnection Ecosystem which studies how an attacker might bring down the Internet; an updated survey paper on Economics and Internet Security which covers recent analytical, empirical and behavioral research; and Can We Fix the Security Economics of Federated Authentication? which explores how we can deal with a world in which your mobile phone contains your credit cards, your driving license and even your car key. What happens when it gets stolen or infected? (blog)

2010 highlights included a paper on why Chip and PIN is broken for which we got coverage on Newsnight and a best paper award (later, the banks tried to suppress this research). Other bank security work included a paper on Verified by VISA and another on the unwisdom of banks adopting proprietary standards. On the control systems front, we published papers on the technical security and security economics of smart meters, on their privacy, on their deployment and on key management for substations. I created a psychology and security web page and wrote a paper on putting context and emotion back in security decisions.

2009 highlights included Database State, a report we wrote about the failings of public-sector IT – many of whose recommendations were adopted by the government elected in 2010; The snooping dragon which explains how the Chinese spooks hacked the Dalai Lama in the run-up to the Peking Olympics; Eight Friends are Enough, which shows how little privacy you have on Facebook; and The Economics of Online Crime. There's a video of a talk I gave on dependability at the IET, as well as a survey paper, the slides, and a podcast. Finally, I wrote an Unauthorised History of Cambridge University.

2008 highlights included a major study of Security Economics and European Policy for the European Commission; the second edition of my book "Security Engineering"; the discovery of serious vulnerabilities in Chip and PIN payment systems; an analysis of the failings of the Financial Ombudsman Service (see also a video from the World Economic Forum in November 2008); the FIPR submission to the Thomas-Walport Review; a piece on confidentiality in the British Journal of General Practice; three videos on privacy made by ARCH; and a video on surveillance. I started a Workshop on Security and Human Behaviour to bring together psychologists with economists and security engineers to work on deception and risk.

2007 highlights included technical papers on RFID and on New Strategies for Revocation in Ad-Hoc Networks (which explores when suicide attacks are effective); a paper on fraud, risk and nonbank payment systems I wrote for the Fed; and a survey paper on Information Security Economics (of which a shortened version appeared in Science). I was a special adviser to the House of Commons Health Committee for their Report on the Electronic Patient Record. Finally, following the HMRC data loss, I appeared in the debate on Newsnight.

2006 highlights included technical papers on topics from protecting power-line communications to the Man-in-the-Middle Defence, as well as a major report on the safety and privacy of children's databases for the Information Commissioner. I ended the year debating health privacy with health minister Lord Warner.

2005 highlights included research papers on The topology of covert conflict, on combining cryptography with biometrics, on Sybil-resistant DHT routing, and on Robbing the bank with a theorem prover; and a big survey paper on cryptographic processors.

2004 highlights included papers on cipher composition, key establishment in ad-hoc networks and the economics of censorship resistance. I also lobbied for amendments to the EU IP Enforcement Directive and organised a workshop on copyright which led to a common position adopted by many European NGOs.

Contact details

University of Cambridge Computer Laboratory
JJ Thomson Avenue
Cambridge CB3 0FD, England

Tel: +44 1223 33 47 33
Fax: +44 1223 33 46 78

I only write and referee for open publications, so I discard emails asking for reports for journals that sit behind a paywall.

By default, when I post a paper here I license it under the relevant Creative Commons license; you may redistribute it with attribution but not modify it. I may subsequently assign the residual copyright to an academic publisher.

If you want to do a PhD, please read our web pages first; we get lots of CV spam which we delete. I also discard emails that ask for internships; we can't employ overseas students on Tier 4 visas any more. If you're interested in coming as an undergrad, please have a look at our video.