Ross Anderson passed away in March 2024. (Obituaries)

We preserve here the content of his personal web space. If you notice any problems, please contact pagemaster@cl.cam.ac.uk.


University of Cambridge Computer Laboratory | University of Edinburgh

Ross Anderson

[Research] [Blog] [Videos] [Politics] [My Book] [Music] [Seminars] [Contact Details]

Machine Learning needs Better Randomness Standards: Randomised Smoothing and PRNG-based attacks shows that the randomness tests long used to check random number generators for use in cryptographic key generation are inadequate for machine learning, where some applications make heavy use of random inputs about which very specific assumptions are made (accepted for Usenix 2024)
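The gap between "passes a randomness test" and "safe to use as random input" can be illustrated with a minimal sketch (my own illustration, not code from the paper; the function names are hypothetical): a bit stream that passes a naive frequency test exactly, yet is completely predictable, so any application that assumes independent random bits fails.

```python
def monobit_fraction(bits):
    """Naive frequency test: the fraction of ones should be near 0.5."""
    return sum(bits) / len(bits)

def lag1_agreement(bits):
    """Fraction of adjacent pairs that agree; about 0.5 for independent bits."""
    agree = sum(1 for a, b in zip(bits, bits[1:]) if a == b)
    return agree / (len(bits) - 1)

# A perfectly alternating stream passes the frequency check exactly,
# yet every bit is predictable from the one before it.
bad = [i % 2 for i in range(10000)]
print(monobit_fraction(bad))  # 0.5
print(lag1_agreement(bad))    # 0.0
```

Real test suites apply many such statistics, but the point stands: passing statistical tests does not certify the independence assumptions that applications like randomised smoothing rely on.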

Defacement Attacks on Israeli Websites is a measurement study of attacks by Palestinian sympathisers on Israeli websites since the Hamas attack on Israel (CW blog). Getting Bored of Cyberwar is a similar study of how pro-Ukrainian hackers responded to the Russian invasion of their country by attacking Russian websites, and pro-Russian hackers then responded (AP SC Magazine The Record)

No Easy Way Out: the Effectiveness of Deplatforming an Extremist Forum to Suppress Hate and Harassment is a measurement study of the industry attempt to take down Kiwi Farms in 2022-23. This holds a number of practical lessons for people interested in online censorship, as well as raising legal and philosophical issues with the approach taken by the UK's Online Safety Bill (The Register; accepted for Oakland 2024)

The Curse of Recursion: Training on Generated Data Makes Models Forget asks what will happen to GPT-{n} once most of the content online is generated by previous models. We show that the use of model-generated content in training leads to irreversible defects in subsequent model generations as the tails of the original distributions disappear, leading to model collapse (The Atlantic, Wall Street Journal, New Scientist, Venture Beat, Business Insider blog)
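The tail-loss mechanism can be seen in a toy simulation (my own sketch, not the paper's code): repeatedly fit a Gaussian to data, then train the next "generation" on samples drawn from the fit, with draws beyond two standard deviations discarded to mimic a model that never reproduces rare events. The distribution's width collapses within a few generations.

```python
import random
import statistics

random.seed(0)

def next_generation(samples, n=2000):
    # "Train" a model: fit a Gaussian to the current data.
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    # Generate the next generation's training set from the model,
    # discarding draws beyond 2 sigma to mimic lost tails.
    draws = [random.gauss(mu, sigma) for _ in range(4 * n)]
    return [x for x in draws if abs(x - mu) <= 2 * sigma][:n]

data = [random.gauss(0.0, 1.0) for _ in range(2000)]
width0 = statistics.stdev(data)
for _ in range(10):
    data = next_generation(data)
print(round(width0, 2), round(statistics.stdev(data), 2))
```

Each truncation step shrinks the standard deviation by a constant factor (about 0.88 for a 2-sigma cut), so after ten generations the fitted distribution has lost most of its spread; in the paper the same effect compounds through the far richer distributions that language models learn.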

One Protocol to Rule Them All? On Securing Interoperable Messaging analyses the EU DMA mandate for messaging systems interoperability. This will vastly increase the attack surface at every level in the stack (blog Register Schneier).

Threat Models over Space and Time: A Case Study of E2EE Messaging Applications shows how Signal Desktop and WhatsApp Desktop are insecure; an opponent with temporary access to your laptop, such as a border guard or an intimate partner, can make this access persistent.

Chat Control or Child Protection debunks the arguments used by the intelligence community that "because children" we needed the Online Safety Bill which gave Ofcom the power to mandate snooping software in your phone (blog). The same arguments were used to support the so-called Child Sex Abuse Regulation which thankfully failed in the European Parliament (blog evidence video) – our big policy win of 2023.

Cambridge forced me to retire in September 2023 when I turned 67, a policy of unlawful age discrimination against which we are campaigning. I am now 20% at Edinburgh and (officially) 20% at Cambridge. I'm teaching a course in Security Engineering at Edinburgh to masters students and fourth-year undergrads, and the lecture videos are now all online (as are the lecture videos and notes for my first-year undergrad course on Software and Security Engineering at Cambridge).

timeline ...

Research

The research students I advise are Bill Marino, Eleanor Clifford, Lawrence Piao, Jenny Blessing, Nicholas Boucher, Anh Viet Vu, and David Khachaturov. My RAs are Richard Clayton and Hridoy Dutta. I also work with Robert Brady. My former RAs are Sergei Skorobogatov, Lydia Wilson, Franck Courbon, Maria Bada, Yi Ting Chua, Ben Collier, Helen Oliver, Ildiko Pete, Daniel Thomas, Alice Hutchings, Sergio Pastrana, David Modic, Sven Übelacker, Julia Powles, Ramsey Faragher, Sophie van der Zee, Mike Bond, Vashek Matyas, Steven Murdoch, Andrei Serjantov and Alex Vetterl. My former students Jong-Hyeon Lee, Frank Stajano, Fabien Petitcolas, Harry Manifavas, Markus Kuhn, Ulrich Lang, Jeff Yan, Susan Pancho-Festin, Mike Bond, George Danezis, Sergei Skorobogatov, Hyun-Jin Choi, Richard Clayton, Jolyon Clulow, Hao Feng, Andy Ozment, Tyler Moore, Shishir Nagaraja, Robert Watson, Hyoungshick Kim, Shailendra Fuloria, Joe Bonneau, Wei-Ming Khoo, Rubin Xu, Laurent Simon, Kumar Sharad, Shehar Bano, Dongting Yu, Khaled Baqer, Alex Vetterl, Mansoor Ahmed and Ilia Shumailov have earned PhDs.

I'm teaching three Cambridge courses in 2023-24: the undergraduate course in Software and Security Engineering and graduate courses in Computer Security and Cybercrime. I also organise our security seminars and help run the Cambridge Cybercrime Centre.

My research topics include:


Machine learning and signal processing

The detection and manipulation of patterns, both overt and covert, has many applications, and the field is being refreshed by the recent revolution in neural networks.


Sustainability of security

Our computers and communications use several percent of global energy, and have secondary costs too – particularly if you have to throw things away for lack of software updates. I also have a long-standing interest in energy management and have more recently been looking at the energy wasted by cryptocurrency mining and at the prevention of wildlife crime. (Incidentally, this website is entirely static – no ads, trackers, javascript or even cookies. The estimated carbon cost per page view is 0.07g compared with over 2g for a typical commercial web page.)


Economics, psychology and criminology of information security

Incentives matter as much as technology for the security of large-scale systems. Systems break when the people who could fix them are not the people who suffer the costs of failure. So it's not enough for security engineers to understand cryptomathematics and the theory of operating systems; we have to understand game theory and microeconomics too. I pioneered the discipline of security economics which is starting to embrace privacy economics, security psychology and criminology too.

I helped establish two relevant workshops: the Security and Human Behaviour workshop, which brings together security engineers and psychologists, and the Workshop on Economics and Information Security, where you meet everyone working in security economics.


Peer-to-Peer and social network systems

One of the seminal papers in peer-to-peer systems was The Eternity Service. The modern era only started once the printing press enabled seditious thoughts to be spread too widely to ban. But when books no longer exist as tens of thousands of paper copies, but as a file on a single server, will courts be able to order them unpublished once more? (This has since happened to newspaper archives in Britain.) So I invented the Eternity Service, in response to growing Internet censorship, as a means of putting electronic documents beyond the censor's grasp. It inspired second-generation censorship-resistant systems such as Publius and Freenet; one descendant is Wikileaks. But the killer app turned out to be not sedition, or even pornography, but copyright. Hollywood's action against Napster led to our ideas being adopted in filesharing systems; they are now re-emerging in the Internet of Things.

Work since the Eternity paper includes the following.


Reliability of security systems

I have been interested for many years in how security systems fail in real life; many security designs are poor because they are based on unrealistic threat models. I started with a study of ATM fraud, and expanded to other applications one after another. This provides a central theme of my book. I also have a separate page on bank security which gathers together all our papers on fraud in payment systems.

The papers on physical security by Roger Johnston's team are also definitely worth a look; see also an old leaked copy of the NSA Security Manual.


Robustness of cryptographic protocols

Many security system failures are due to poorly designed protocols, and this has been a Cambridge interest for many years. Some relevant papers follow.

Protocols have been the stuff of high drama. Citibank asked the High Court to gag the disclosure of certain crypto API vulnerabilities that affect a number of systems used in banking. I wrote to the judge opposing this; a gagging order was still imposed, although in slightly less severe terms than Citibank had requested. The trial was in camera, the banks' witnesses didn't have to answer questions about vulnerabilities, and new information revealed about these vulnerabilities in the course of the trial may not be disclosed in England or Wales. Information already in the public domain was unaffected. The vulnerabilities were discovered by Mike Bond and me while acting as the defence experts in a phantom withdrawal court case, and independently discovered by the other side's expert, Jolyon Clulow, who later joined us as a research student. They are of significant scientific interest, as well as being relevant to the rights of the growing number of people who suffer phantom withdrawals from their bank accounts worldwide. Undermining the fairness of trials and forbidding discussion of vulnerabilities isn't the way forward (press coverage by the Register).


Cryptography, including quantum cryptography

Lots of people don't believe quantum crypto is practical. I also don't believe the security proofs offered for entanglement-based quantum cryptosystems, because they assume that the strange behaviour observed in the Bell tests must result from nonlocal action. But it can also emerge from pre-existing long-range order. One explanation, advocated by Nobel prizewinner Gerard 't Hooft, is the cellular automaton interpretation of quantum mechanics; see his keynote talk at EMQM 2015. I have done some work with Robert Brady to develop another line of inquiry.

In the 1990s I worked with Eli Biham and Lars Knudsen to develop Serpent – a candidate block cipher for the Advanced Encryption Standard. Serpent got the second largest number of votes.

Other papers on cryptography and cryptanalysis include the following.

Another of my contributions was founding the series of workshops on Fast Software Encryption.


Security of Clinical Information Systems

The safety and privacy of clinical systems have been a problem for years. Recent scandals include the Google DeepMind case (exposed by my then postdoc Julia Powles) where the Royal Free Hospital gave Google a million patients' records that they shouldn't have; and the care.data affair where a billion records – basically all hospital care episodes since 1998 – were sold to 1200 firms worldwide, in a format that enabled many patients to be re-identified. It wasn't much better under the previous Labour government, which had a series of rows over thoughtless and wasteful centralisation. There is now an NGO, MedConfidential, which monitors and campaigns for health privacy.

The NHS has a long history of privacy abuses. Gordon Brown's own medical records were compromised while he was prime minister, but the miscreant got off scot-free as it was "not in the public interest" to prosecute him. In another famous case, Helen Wilkinson had to organise a debate in Parliament to get ministers to agree to remove defamatory and untrue information about her from NHS computers. The minister assured the House that the libels had been removed; months later, they still had not been.

Here are my most recent papers on the subject.

Civil servants started pushing for online access to everyone's records in 1992 and I got involved in 1995, when I started consulting for the British Medical Association on the safety and privacy of clinical information systems. Back then, the police were given access to all drug prescriptions, after the government argued that they needed it to catch doctors who misprescribed heroin. The police got their data, but they didn't catch Harold Shipman, and no-one was held accountable. The NHS slogan in 1995 was 'a unified electronic patient record, accessible to all in the NHS'. The BMA campaigned against this, arguing that it would destroy patient privacy.

In 1996, the Government set up the Caldicott Committee to study the matter. Their report made clear that the NHS was already breaking confidentiality law by sharing data without consent; but the next Government just legislated (and regulated, and again) to give itself the power to share health data as the Secretary of State saw fit. (We objected and pointed out the problems the bill could cause; similar sentiments were expressed in a BMJ editorial, and a Nuffield Trust impact analysis, and BMJ letters here and here. Ministers claimed the records were needed for cancer registries: yet cancer researchers work with anonymised data in other countries – see papers here and here.) There was a storm of protest in the press: see the Observer, the New Statesman, and The Register. But that died down; the measure has now been consolidated as sections 251 and 252 of the NHS Act 2006, and the Thomas-Walport review blessed nonconsensual access to health records (despite FIPR pointing out that this was illegal – a view later supported by the European Court). A government committee, the NHS Information Governance Board, was set up to oversee this lawbreaking, and Dame Fiona is being wheeled out once more. Centralised, nonconsensual health records not only contravene the I v Finland judgement but the Declaration of Helsinki on ethical principles for medical research and the Council of Europe recommendation no R(97)5 on the protection of medical data.

Two health IT papers by colleagues deserve special mention. Privacy in clinical information systems in secondary care describes a hospital system implementing something close to the BMA security policy (it is described in more detail in a special issue of the Health Informatics Journal, v 4 nos 3-4, Dec 1998, which I edited). Second, Protecting Doctors' Identity in Drug Prescription Analysis describes a system designed to de-identify prescription data for commercial use; although de-identification usually does not protect patient privacy very well, there are exceptions, such as here. This system led to a court case, in which the government tried to stop its owner promoting it – as it would have competed with their (less privacy-friendly) offerings. The government lost: the Court of Appeal decided that personal health information can be used for research without patient consent, so long as the de-identification is done competently.

Resources on what's happening in the USA include many NGOs: Patient Privacy Rights may have been the most influential, but see also EPIC, the Privacy Rights Clearinghouse, the Citizens' Council on Health Care, the Institute for Health Freedom, and CDT. Older resources include an NAS report entitled For the Record: Protecting Electronic Health Information, a report by the Office of Technology Assessment, a survey of the uses of de-identified records for the DHHS, and a GAO report on their use in Medicare. As for the basic science, see my book chapters on Boundaries and on Inference Control.


Public policy issues

I've been involved over the years with academic freedom, and with digital rights more generally.

I chair the Foundation for Information Policy Research, the UK's leading Internet policy think tank, which I helped set up in 1998. We are not a lobby group; our enemy is ignorance rather than the government of the day, and our mission is to understand IT policy issues and explain them to policy makers and the press. We had a conference for our 25th anniversary in 2023 (blog), another for our 20th in 2018 (blog), and here are the issues as we saw them in 2008 and 1999. Some highlights of our work follow.

I served on Council, Cambridge University's governing body, 2003–10 and 2015–18. I stood for election because of a proposal that most of the intellectual property generated by faculty members – from patents on bright ideas to books written up from lecture notes – would belong to the university rather than to its creator. To stop this, and to prevent further incidents like this one, we founded the Campaign for Cambridge Freedoms. The final vote approved a policy according to which academics keep copyright but the University gets a share of patent royalties. I got re-elected in 2006, and in my second term we won an important vote to protect academic freedom. For more, see my article from the Oxford Magazine. From 2013–14 I was on our Board of Scrutiny. In my third term my main contribution was investigating the delays and cost overruns in a large construction project.

Since then the culture wars came to Cambridge. Should our university require us to treat foolish or obnoxious colleagues with "respect", or just with "tolerance"? Our VC demanded "respect" but we called a free speech vote and academics voted decisively for tolerance instead. See Varsity, Newsweek, the FT, the Spectator, the Mail, the Sunday Times, the Times Higher Education Supplement, the Cambridge Student, the Cambridge Radical Feminist Network, Stephen Fry – and the Minister of State for Universities.

Our latest campaign is against Cambridge's policy of forcing academics to retire at 67, an outdated policy to which only Cambridge and Oxford cling; Oxford's version was found illegal in March 2023. Our campaign page is here.

My CV is here while my h-index is tracked here. I'm a Fellow of Churchill College, the Royal Society, the Royal Academy of Engineering, the Institution of Engineering and Technology, the Institute of Mathematics and its Applications, and the Institute of Physics. I won the 2015 Lovelace medal; the interviews I did for that award are here, while my oral history interview transcript is here and an Academy video is here. As for my academic genealogy, my thesis adviser was Roger Needham; his was Maurice Wilkes; then it runs back through Jack Ratcliffe, Edward Appleton, Ernest Rutherford, JJ Thomson, Lord Rayleigh, Edward Routh, William Hopkins, Adam Sedgwick, Thomas Jones, Thomas Postlethwaite, Stephen Whisson, Walter Taylor, Roger Smith, Roger Cotes, Isaac Newton, Isaac Barrow and Vincenzo Viviani to Galileo Galilei. For context, see my Unauthorised History of Cambridge University.

Finally, here is my PGP key. If I revoke this key, I will always be willing to explain why I have done so provided that the giving of such an explanation is lawful. (For more, see FIPR.)


My Book on Security Engineering


The third edition is now on sale – you can read sample chapters on my book page.

Security engineering is about building systems to remain dependable in the face of malice, error or mischance. As a discipline, it focuses on the tools, processes and methods needed to design, implement and test complete systems, and to adapt existing systems as their environment evolves. My book has become the standard textbook and reference since it was published in 2001. You can download both the first and second editions without charge here; the third edition will become free from 2024.

Security engineering is not just concerned with infrastructure matters such as firewalls and PKI. It's also about specific applications, such as banking and medical record-keeping, and increasingly about embedded systems such as payment terminals and burglar alarms. It's usually done badly, so it often takes several attempts to get a design right. It's also hard to learn: although there were good books on a number of the component technologies, such as cryptography and operating systems, there was little about how to use them effectively, and even less about how to make them work together. Most systems don't fail because the mechanisms are weak, but because they're used wrong.

My book was an attempt to help the working engineer to do better. As well as the basic science, it contains details of many applications – and a lot of case histories of how their protection failed. It describes a number of technologies which aren't well described elsewhere. The first edition was pivotal in founding the now-flourishing field of information security economics: I realised that the narrative had to do with incentives and organisation at least as often as with the technology. The second edition incorporated the economic perspectives we've developed since then, and new perspectives from the psychology of security, as well as updating the technological side of things. The third edition is an update for the new world of phones, cloud services and social media; it tackles the problems raised by cars and medical devices such as the interaction of security with safety, and the costs of long-term patching; it also adds a huge amount about modern threat actors, from the cybercrime ecosystem to what we learned about state capabilities from the Snowden leaks and elsewhere.

More ...


Highlights by year

2022 highlights include ExtremeBB, a database we collect of extremist postings to support research by political scientists, criminologists and others; CoverDrop which lets a newspaper build an end-to-end encrypted messenger into its app for whistleblowers; a paper on Chat Control or Child Protection for the latest round of the crypto wars; a study of the failures of security proofs; and two developments of Bad Characters and Trojan Source – one showing how these techniques easily mislead search engines, while the other mapping the impulse response of the vulnerability disclosure ecosystem.

2021 highlights include Bad characters and Trojan source, of which the first broke all large language models and the second all computer languages; two adversarial machine-learning papers, on data ordering attacks and markpainting; an analysis of cybercrime ventures as startups; and Bugs in our Pockets, the latest round in the Crypto Wars.

2020 highlights include sponge attacks and nudge attacks on machine-learning systems, along with work on adversarial reinforcement learning and on decoding smartphone sounds with a voice assistant. But my main project in 2020 was writing a third edition of my Security Engineering textbook.

2019 highlights include an acoustic side channel on smartphones, one paper on whistleblowing and two papers on blocking adversarial machine learning. The big paper was on Measuring the Changing Cost of Cybercrime; since we did the first systematic study seven years ago, the patterns changed surprisingly little despite a huge change in technology. Finally I gave an invited talk at 36C3 on the sustainability of safety, security and privacy.

2018 highlights include papers on what's wrong with bitcoin exchanges and how to trace stolen bitcoin; on making security sustainable; controlling side effects in mainstream C compilers; how protocols evolve and a gullibility metric. There's also an invited talk on privacy for tigers.

2017 highlights include Standardisation and Certification in the Internet of Things, an analysis for the EU of what happens when we get software everywhere, and which informed EU directive 2019/771 on the sale of goods; DigiTally, a prototype payment system we built to extend mobile phone payments to areas of less developed countries with no phone service, using a novel protocol; and a book chapter I wrote for EPIC.

2016 highlights include a new Android side channel; an investigation of the social externalities of trust; studies of when lying feels the right thing to do, of taking down websites to prevent crime and bank fraud reimbursement; and finally two papers on Brexit.

2015 highlights included Keys Under Doormats, on what's wrong with government attempts to mandate exceptional access to all our data; a Nuffield report on what happens to health privacy in a world of cloud-based medical records and pervasive genomics; a report on the emotional impact of Internet fraud; two papers on how to do lie detection using analysis of body motion; severe flaws in Android factory reset and mobile anti-virus apps; and a novel demonstration that the Bell test results can come from pre-existing long-range order as easily as from nonlocal action.

2014 highlights included papers on Chip and Skim describing pre-play frauds against EMV bank cards; Security protocols and evidence which explains how the systems needed to support proper dispute resolution just don't get built; Experimental Measurement of Attitudes Regarding Cybercrime, on how prosecutors and public opinion are out of step; The psychology of malware warnings, on how to get users to pay attention to risk; Privacy versus government surveillance, on network economics and international relations; and Why bouncing droplets are a pretty good model of quantum mechanics, which solves an outstanding mystery in physics.

2013 highlights included Rendezvous, a prototype search engine for code; a demonstration that we could steal your PIN via your phone camera and microphone; an analysis of SDN Authentication; and papers on quantum computing and Bell's inequality.

2012 highlights included a big report on Measuring the Cost of Cybercrime and a history of security economics; an attempt to kill the government's smart metering project; three papers on dynamic networks; and four papers on payment protocols: Chip and Skim: cloning EMV cards with the pre-play attack, How Certification Systems Fail, A birthday present every eleven wallets? and Social Authentication – harder than it looks. Finally, Risk and privacy implications of consumer payment innovation discusses both payment and economic issues.

2011 highlights included a major report on the Resilience of the Internet Interconnection Ecosystem which studies how an attacker might bring down the Internet; an updated survey paper on Economics and Internet Security which covers recent analytical, empirical and behavioral research; and Can We Fix the Security Economics of Federated Authentication? which explores how we can deal with a world in which your mobile phone contains your credit cards, your driving license and even your car key. What happens when it gets stolen or infected? (blog)

2010 highlights included a paper on why Chip and PIN is broken for which we got coverage on Newsnight and a best paper award (later, the banks tried to suppress this research). Other bank security work included a paper on Verified by VISA and another on the unwisdom of banks adopting proprietary standards. On the control systems front, we published papers on the technical security and security economics of smart meters, on their privacy, on their deployment and on key management for substations. I created a psychology and security web page and wrote a paper on putting context and emotion back in security decisions.

2009 highlights included Database State, a report we wrote about the failings of public-sector IT – many of whose recommendations were adopted by the government elected in 2010; The snooping dragon which explains how the Chinese spooks hacked the Dalai Lama in the run-up to the Peking Olympics; Eight Friends are Enough, which shows how little privacy you have on Facebook; and The Economics of Online Crime. There's a video of a talk I gave on dependability at the IET, as well as a survey paper, the slides, and a podcast. Finally, I wrote an Unauthorised History of Cambridge University.

2008 highlights included a major study of Security Economics and European Policy for the European Commission; the second edition of my book "Security Engineering"; the discovery of serious vulnerabilities in Chip and PIN payment systems; an analysis of the failings of the Financial Ombudsman Service (see also a video from the World Economic Forum in November 2008); the FIPR submission to the Thomas-Walport Review; a piece on confidentiality in the British Journal of General Practice; three videos on privacy made by ARCH; and a video on surveillance. I started a Workshop on Security and Human Behaviour to bring together psychologists with economists and security engineers to work on deception and risk.

2007 highlights included technical papers on RFID and on New Strategies for Revocation in Ad-Hoc Networks (which explores when suicide attacks are effective); a paper on fraud, risk and nonbank payment systems I wrote for the Fed; and a survey paper on Information Security Economics (of which a shortened version appeared in Science). I was a special adviser to House of Commons Health Committee for their Report on the Electronic Patient Record. Finally, following the HMRC data loss, I appeared in the debate on Newsnight.

2006 highlights included technical papers on topics from protecting power-line communications to the Man-in-the-Middle Defence, as well as a major report on the safety and privacy of children's databases for the Information Commissioner. I ended the year debating health privacy with health minister Lord Warner.

2005 highlights included research papers on The topology of covert conflict, on combining cryptography with biometrics, on Sybil-resistant DHT routing, and on Robbing the bank with a theorem prover; and a big survey paper on cryptographic processors.

2004 highlights included papers on cipher composition, key establishment in ad-hoc networks and the economics of censorship resistance. I also lobbied for amendments to the EU IP Enforcement Directive and organised a workshop on copyright which led to a common position adopted by many European NGOs.


Contact details

University of Cambridge Computer Laboratory
JJ Thomson Avenue, Cambridge CB3 0FD, England

E-mail: Ross.Anderson@cl.cam.ac.uk
School of Informatics, University of Edinburgh
10 Crichton Street, Edinburgh, EH8 9AB

E-mail: Ross.J.Anderson@ed.ac.uk

I only write and referee for open publications, so I discard emails asking for reports for journals that sit behind a paywall.

By default, when I post a paper here I license it under the relevant Creative Commons license; you may redistribute it with attribution but not modify it.

I can no longer admit PhD students for Cambridge, because of forthcoming mandatory retirement; so if you want to do a PhD, please read the relevant web pages. I still admit PhD students at Edinburgh.