Economics and Security Resource Page
Do we spend enough on keeping ‘hackers’ out of our computer
systems? Do we not spend enough? Or do we spend too much? For that matter, do
we spend too little on the police and the army, or too much? And do we spend
our security budgets on the right things?
The economics of security is a hot and rapidly growing field of research. More
and more people are coming to realise that security failures are often due to
perverse incentives rather than to the lack of suitable technical protection
mechanisms. (Indeed, the former often explain the latter.) While much recent
research has been on ‘cyberspace’ security issues — from
hacking through fraud to copyright policy — it is expanding to throw
light on ‘everyday’ security issues at one end, and to provide new
insights and new problems for ‘normal’ computer scientists and
economists at the other. In the commercial world, as in the world of diplomacy,
there can be complex linkages between security arguments and economic ends.
This page provides links to a number of key papers, conferences, the home pages of
active researchers, relevant books, and other resources. Complementary pages include Alessandro
Acquisti's privacy economics page, my security psychology page,
Jean Camp's bibliography, and job ads for security researchers.
Our annual bash is the Workshop on the
Economics of Information Security; the next one will be in Boston on June
3-4 2019. See below for links to past workshops, for
all the workshop papers to date, and for other conferences with some security
economics content.
- Tyler Moore and I have written a series of survey papers on security
economics. The latest (2011) appeared as a tech report Economics and
Internet Security: a Survey of Recent Analytical, Empirical and Behavioral
Research and later as a book chapter. A
previous survey paper, Information
Security Economics – and Beyond, appeared in various versions from
2006 to 2009. There was a short survey in
Science in late 2006; a version for
economists at Softint in January 2007; a version for
security engineers at Crypto in August 2007 (see slides); a book
chapter; and finally an archival journal
version in Phil
Trans Roy Soc A (Aug 2009).
- Online Security Risks was one of the early pieces, and is still a good
introduction. Hal Varian shows how a range of problems, from bank fraud to
distributed denial-of-service attacks, result when the incentives to avoid
abuse are poorly allocated. An analysis of cash machine
fraud, for example, showed that banks in countries with strong customer
rights suffered less fraud; complaints could not be ignored or brushed aside,
so they took more care than in countries where it was harder for fraud victims
to get redress.
- Why Information
Security is Hard – An Economic Perspective was the paper that got
information security people thinking about the subject. I showed how economic
analysis explains many phenomena that security researchers had previously found
to be pervasive but perplexing. Why do mass-market software products such as
Windows contain so many security bugs? Why are their security mechanisms so
difficult to manage? Why for that matter are so many specialist security
products second-rate, with bad ones driving good ones out of the market? Why is
it hard for people to use security for competitive advantage – and how
might they? Why are government evaluation schemes, such as the Orange Book and
the Common Criteria, so bad? For that matter, why do government agencies
concerned with information warfare concentrate on offense rather than defense,
even now that the Cold War is over?
- Cryptographic abundance and pervasive computing by Andrew Odlyzko was an early paper to
point out the economic and social limits on security technology – if a
boss's secretary cannot forge his signature, a digital security system is as
likely to subtract value as add it.
- The resilience of the Internet interconnection ecosystem is a major report we
wrote for ENISA and which has been adopted as European policy. Here is the full
report (238 pages) and, for the busy, the 31-page executive
summary. We documented how the Internet actually works in practice, as
opposed to in theory; we investigated how network operators negotiate peering
and transit, what goes wrong, how they deal with failures and where the
incentives for resilience are inadequate.
- Cars, Cholera and Cows:
The Management of Risk and Uncertainty is a classic paper by John Adams on
why organisations (and in particular governments) tend to be more risk-averse
than rational economic considerations would dictate. One of the mechanisms is
adverse selection: the people who end up in risk management jobs tend to be
more risk-averse than average.
- Electronic Commerce: Who Carries the Risk of Fraud? (by Nick Bohm, Ian Brown and Brian
Gladman) documents how many banks have seen online banking, and information
security mechanisms such as cryptography and digital signatures, as a means of
dumping on their customers many of the transaction risks that they previously
bore themselves in the days of cheque-based and even telephone banking.
- Deworming the Internet looks at the incentives facing virus writers,
software vendors and computer users. Its author Douglas Barnes asks
what policy initiatives might make computers less liable to infection.
- Economics, Psychology and Sociology of Security by Andrew Odlyzko discusses a
number of ways in which cultural factors undermine the formal
assumptions underlying many security systems, and gives some insights
from evolutionary psychology; for example, we have specialised neural
circuits to detect cheating in social situations.
- Adverse Selection
in Online 'Trust' Certifications by Ben Edelman shows that websites with
the Trust-e seal of approval are much
more likely to be malicious than uncertified websites. Crooks have a greater
incentive to buy certification than honest merchants, so if the vetting
process isn't strict enough your certification scheme can easily end up
certifying the reverse of what it seems to.
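Edelman's adverse-selection argument can be sketched with Bayes' rule. The base rate, purchase probabilities and vetting strictness below are illustrative assumptions of mine, not figures from his paper:

```python
def p_malicious_given_seal(base_rate, p_buy_bad, p_buy_good, p_vet_catch):
    """Bayes' rule: probability a seal-bearing site is malicious, given how
    often bad and good sites buy certification and how often vetting
    rejects a bad applicant. All parameters are illustrative assumptions."""
    bad_certified = base_rate * p_buy_bad * (1 - p_vet_catch)
    good_certified = (1 - base_rate) * p_buy_good
    return bad_certified / (bad_certified + good_certified)

# 2% of sites malicious; crooks buy seals eagerly, honest sites rarely,
# and lax vetting catches only 30% of crooked applicants
p = p_malicious_given_seal(0.02, 0.8, 0.1, 0.3)
# certified sites are now ~10% malicious -- about five times the base rate
```

Stricter vetting (a higher `p_vet_catch`) is the only lever in this toy model that can push the certified pool below the base rate.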
- Economics of
Malware: Security Decisions, Incentives and Externalities is an OECD survey
of the misaligned incentives that drive the malware business, as perceived by
multiple stakeholders. It was published together with another survey on
malware prepared by the members of the OECD Working Party on Information
Security and Privacy (WPISP).
Economics of Privacy
See also Alessandro
Acquisti's privacy economics page.
- Towards a
positive theory of privacy law argues that the politics of privacy are
difficult because the economics are too. Markets can have a pooling equilibrium
(where everyone agrees on privacy) but when this breaks you may move to a
separating equilibrium where people prefer disclosure, to prove they have
little to hide. Discrimination can work in unexpected ways; for example, many
African-Americans favour criminal-record disclosure, as without it an employer
may fear you're one of the 30% who have been in jail; so a total lack of privacy
may favour the other 70%. In the marketplace in general, privacy laws tend to
shift surplus from firms (who are few and organised) to the richer consumers
(who are less so). But sophisticated consumers (who can game the system to some
extent) may benefit from separating rather than pooling equilibria, and such
people may also be more active politically. Political activists tend to be
extraverts. So for a mix of market and behavioural reasons, legislatures may
provide less than the socially optimal level of privacy. General-purpose privacy
laws (as in Europe) can help, as they may prevent markets unravelling into
separating equilibria in the first place.
- The question Is Patient
Data Better Protected in Competitive Healthcare Markets? has a negative
answer; US cities with competitive as opposed to monopoly hospital provision
have more data breach reports, suggesting that patients undervalue security as
it's less visible than other hospital attributes.
- Resources, Capabilities and Cultural Values: Links to Security Performance and
Compliance finds strong correlations between functional capabilities
(prevention and audit) and compliance in the healthcare industry, but not with
security: the better the audit, the worse the security! Similarly, when the
authors looked at cultural aspects, they found that collaboration was correlated
positively with compliance but not with security.
- Miguel Malheiros asks Would You
Sell Your Mother's Data?, reasoning that the amount of data people will
disclose should depend on the reason for the request as well as the
sensitivity; they found that disclosure was inversely correlated with
sensitivity but that explanations made no difference. They conclude that TV
license and council tax payment history might be usable for applicants with
‘thin’ credit histories, but indices of social capital may not be.
- The privacy
economics of voluntary over-disclosure in Web forms investigates why people
volunteer more information than they need to online; one factor is self-esteem,
of "being a good person". But mandating some fields, especially sensitive ones,
reduces voluntary over-disclosure on others.
- The Privacy Landscape: Product Differentiation on Data Collection by Sören
Preibusch and Joe Bonneau tests the hypotheses that web sites differentiate on
privacy (H1); sites offering free services differentiate less (H2); that sites
which differentiate can charge cheaper prices (H3); and that monopolists are
less privacy friendly (H4). They looked at 130 web sites in three service and
two goods categories, plus ten monopolies. H1, H2 and H4 are supported but H3
isn't; in fact it's the other way round, with the more privacy-friendly sites
having cheaper prices. So there is no trade-off between getting a good deal and
keeping data private; privacy does not come at a premium, except when you have
to deal with a monopoly.
- Economic Aspects of Personal Privacy is an early discussion by Hal Varian of how
market mechanisms might solve privacy problems, while Richard Posner's Orwell versus
Huxley: Economics, Technology, Privacy, and Satire touches on a number of
economic aspects of privacy and security technology.
- Privacy, Economics and Price Discrimination tackles one of the thorniest
market-failure problems. Why is privacy being eroded so rapidly, despite many
people saying they care about it? Andrew Odlyzko's analysis puts much of the
blame on differential pricing. Technology is increasing both the incentives and
the opportunities for this. From airline yield management to complex and
constantly changing software and telecomms prices, differential pricing is
economically efficient – but increasingly resented by consumers. His
Unsolvable Privacy Problem and its Implications for Security Technologies
extends the argument to personalised pricing.
- Conditioning prices on purchase history, by Alessandro Acquisti and Hal Varian, analyses
the market conditions under which first-degree price discrimination will
actually be profitable for a firm.
- Privacy and
Rationality: Preliminary Evidence from Pilot Data, by Alessandro Acquisti
and Jens Grossklags, studies the specific problem of why people express a high
preference for privacy when interviewed but reveal a much lower preference
through their behaviour both online and offline.
- Guns, Privacy, and Crime by Alessandro Acquisti and Catherine Tucker studies the
effect of the publication of a list of gun permit holders in Memphis,
Tennessee. This was publicised in February 2009 by a shooting incident in a
shopping mall, and gives an insight into people's tradeoff between privacy and
security. Would knowing which neighbourhoods in Memphis had the most guns deter
crime? It did indeed, with a significant drop in burglaries for about 15 weeks.
- An End
to Privacy Theater: Exposing and Discouraging Corporate
Disclosure of User Data to the Government, by Chris Soghoian, discusses how
the pricing of law enforcement access to stored data, and other incentives,
might be used to minimise privacy invasion by the state.
- Why we can't
be bothered to read privacy policies – models of privacy economics as a
lemons market, by Tony Vila, Rachel Greenstadt, and David Molnar, examines
why many consumers fail to think of future price discrimination when giving
information to merchants.
- In Opt In Versus Opt
Out: A Free-Entry Analysis of Privacy Policies, Jan Bouckaert and Hans
Degryse compare the competitive effects of three customer privacy policies
– anonymity, opt-in and opt-out. Under certain assumptions, opt out is
the socially preferred privacy regime: the availability in the market of
information about the buying habits of most customers, rather than a few
customers, helps competitors to enter the market.
- Who Signed Up for the
Do-Not-Call List?, by Hal Varian, Fredrik Wallenberg and Glenn Woroch,
analyses the FCC's telephone-sales blacklist by district. Privacy means
different things to different population groups, but this raises further
questions. For example, educated people are more likely to sign up, as one
would expect: but is that because rich households get more calls, because they
value their time more, or because they understand the risks better? In Financial Privacy for
Free?, Alessandro Acquisti and Bin Zhang apply a similar analysis to credit
reporting. In Is There a
Cost to Privacy Breaches? Alessandro Acquisti, Allan Friedman and Rahul
Telang look at the effect on companies' stock prices of reported breaches of
their customers' privacy.
- Property Rights & Efficiency: The Economics of Privacy as Secrecy, by
Benjamin Hermalin and Michael Katz criticises the Chicago school view that
more information is better (if collected costlessly), and argues that privacy
can be efficient even when there is no 'taste' for privacy per se. The authors
develop a general model which also challenges the Varian view that privacy
could be achieved simply by giving individuals property rights in their
personal information, without any need to ban information transmission or use.
The flow of information between trading partners can reduce ex-post trade
efficiency when the increase in information does not lead to symmetrically or
fully informed parties.
- On the Economics
of Anonymity studies why anonymity systems are hard to sell, and points out
some of their novel aspects. For example, honest players want some level of
free-riding, in order to provide cover traffic. So equilibria can also be
novel, and the ways in which they break down can be complex. We also have to
consider a wider range of principals – dishonest, lazy, strategic,
sensitive, and myopic – than in most of the markets that economists try
to model. Anonymity Loves
Company: Usability and the Network Effect continues this analysis to show
when a user will prefer a weak but popular anonymity system over a strong but
unpopular one.
- When 25 Cents is
too much: An Experiment on Willingness-To-Sell and Willingness-To-Protect
Personal Information by Jens Grossklags and Alessandro Acquisti provides a
nice exposition of the difference between willingness to pay and willingness
to accept, with the latter usually being dramatically higher, and discusses how
this gap can be applied to understand the privacy paradox.
- In Health
Disclosure Laws and Health Information Exchanges, Idris Adjerid and
colleagues studied HIE data against state privacy laws; 7 US states are pro-HIE
but require consent while 11 are pro-HIE and don't. Three each are pro-HIE and
pro-privacy, while 26 have no relevant laws. A panel analysis showed that the
regime most likely to promote HIEs was the first, namely a pro-HIE law plus a
consent requirement. The effect is even stronger in states with pre-existing
privacy laws, and it's robust to other health IT adoption measures, breadth of
coverage and even other healthcare characteristics. The authors conclude that
the only HIE-promoting initiatives that worked were those that included privacy
requirements. The main limitation is the small size of the dataset.
- In Privacy, Network
Effects and Electronic Medical Record Technology Adoption, Amalia Miller
and Catherine Tucker provide a time-series analysis of health IT adoption
rates compared to health privacy laws in various US states.
- The Effect of
Online Privacy Information on Purchasing Behavior: An Experimental Study by
Janice Tsai, Serge Egelman, Lorrie Cranor and Alessandro Acquisti shows that by
making information about website privacy policies more accessible and salient
it is possible to get shoppers to pay more attention to it and even to pay a
premium for privacy.
- Conformity or
Diversity: Social Implications of Transparency in Personal Data Processing
by Rainer Boehme studies whether making information widely available about the
bases on which decisions are taken about individuals will lead to more
conformity (because, in the absence of information asymmetries and strategic
interaction with others, the optimal behaviour becomes mainstream) or diversity
(as in the absence of transparency, individuals are herded together by
uncertainty and fear). He presents a model of how preferences and signaling
behaviour might interact.
- The Economics of Privacy is a literature survey by Kai-Lung Hui and Ivan Png.
The Information Security Business
- Choice Architecture and Smartphone Privacy: There’s A Price for That by Serge
Egelman and colleagues notes that 73% of smartphone malware uses SMS while only
3% of genuine apps do; but it’s a bother to deal with install-time permissions
and fewer than 20% do so. Serge surveyed 483 US residents to see if they’d pay
more for a newsreader that didn’t ask for location or record audio: 212
cost-sensitive subjects chose the cheapest and 120 chose the most private. In a
second experiment, he ran an auction to see how much 353 Android users would
demand in return for giving an app access to GPS, contacts or photos; the only significant
effect was on contacts.
- The Underground Economy of Fake Antivirus Software
reports data from 21 servers including tools, affiliate networks and financial
figures. Affiliates are paid to infect computers; the servers reported
conversion rates around 2%, netting the server owners eight-figure incomes.
Refund rates were between 3% and 9%, and there is clear evidence of collusion
between the operators and payment processors – one of whom was taking an 18% fee for a
“high-risk merchant account” and was giving the operator helpful
advice and tech support, while setting a chargeback threshold over which the
merchant would be terminated. The authors suggest that crooked payment
processors might be the weakest link and argue that VISA and MasterCard should
stop doing business with them.
- In Encryption
and Data Loss, Amalia Miller and Catherine Tucker report the surprising and
important result that hospitals that adopt encryption end up having more
breaches, not fewer! The data breach notification laws that give encryption
exemptions don't explain it all, either. It seems that encryption software
makes staff careless, and the damage outweighs the actual protection the
encryption provides.
- In An
Organizational Learning Perspective on Proactive vs. Reactive investment in
Information Security, Juhee Kwon and Eric Johnson studied the 281 healthcare
firms reporting data breaches from the HIMSS database of 2,386 health
organisations and found that the effectiveness of security measures is
reduced if they're forced by regulation rather than adopted voluntarily; and
that external breaches affect investment while internal ones don't. They
suggest that regulators require organisations to spend a certain proportion of
their IT budget on security rather than telling them what to spend it on.
- In Inglorious
Installers: Security in the Application Marketplace, Jonathan Anderson,
Joseph Bonneau and Frank Stajano explore the transparency and confinement of
application installers across twelve application markets. The clustering
suggests it's about markets, not technology! Factors include externalities and
asymmetric information; in addition, the cost of evaluating code depends on who
does it. The key tool may be branding: firms like Symbian and recently Apple
vet applications and expose their own brand in return for market power. The
Android market by contrast is a Wild West.
- In Information Governance:
Flexibility and Control through Escalation and Incentives, Xia Zhao and
Eric Johnson model the over- and under-entitlement to information that arise in
firms due to agency and path-dependence. They argue that to align employees'
interests with those of the firm, employers should use rewards as well as
penalties, and allow staff to escalate their own access rights when needed
within this framework.
- Will Outsourcing IT
Security Lead to a Higher Social Level of Security? reports a study done by
Brent Rowe for the DHS on whether outsourcing improves or undermines security.
He concludes that it depends on what's outsourced (auditing, vulnerability
testing, monitoring, insurance, implementation or even system management). Most
firms outsource at least one service, and how much they buy in is sector
dependent; also, the more they outsource, the less their overall security
spend. Decisions also depend on scale economies and network effects.
- In Models and Measures for
Correlation in Cyber-Insurance, Rainer Boehme and Gaurav Kataria
examine the effects of local versus global correlation on insurance markets.
They show that in many economically important cases (such as globally
correlated risks from the worldwide spread of a worm or virus) there may be no
market solution, as the insurer's cost of safety capital becomes too high.
- In Privacy
Insurance contracts: Modeling and their use in Managing Risk in ICT firms,
Athanassios Yannacopoulos and colleagues explore insurance market failures due
to asymmetric information, and provide a random utility model for assessing
insurance against privacy failures.
- In How and
Why More Secure Technologies Succeed in Legacy Markets: Lessons from the
Success of SSH, Nicholas Rosasco and David Larochelle discuss why many
security products failed and a few, like SSH, succeeded – in its case it
provided added non-security benefits to
users. In Bootstrapping
the Adoption of Internet Security Protocols, Andy Ozment and Stuart
Schechter provide a model for the adoption of a security service in the face of
network externalities and discuss how bundling might be used to help roll them
out.
- The Economic
Impact of Role-Based Access Control is a study commissioned by the US
National Institute of Standards and Technology to assess the economic
impact of an investment they made in promoting role-based access control. It
appears to be the first serious study that uses the return on investment to
assess research in the field.
- Consequences of Sharing Security Information (by Esther Gal-Or and Anindya Ghose), and An
Economics Perspective on the Sharing of Information Related to Security
Breaches (by Larry Gordon), analyse the incentives that firms have to share
information on security breaches within the context of
the ISACs set up by the US government.
Models developed for trade associations and research joint ventures can be
applied to work out optimal membership fees and other incentives.
- Kevin Soo Hoo's
thesis was a first attempt to bring some econometrics to the field. It
looks at what countermeasures might be most cost-effective, given the FBI data.
He also has an article
analysing the return on security investment, which he puts at an unexciting
17-21 percent. (See press coverage.) There is also a
government guide to doing risk assessment and cost-benefit analysis.
- The economic
cost of publicly announced information security breaches: empirical
evidence from the stock market, by Katherine Campbell, Larry Gordon,
Marty Loeb and Lei Zhou, provides an analysis of the effect of security
scares on share prices. There is a highly significant negative market
reaction for information security breaches involving unauthorized access
to confidential data, but no significant reaction when the breach does
not involve confidential information. Thus stock market participants
appear to discriminate across types of breach.
Economics of Vulnerabilities
- Empirical Analysis
of Data Breach Litigation by Sasha Romanosky, David Hoffman and Alessandro
Acquisti analyses what data breaches lead to federal litigation and to actual
settlements. Litigation seems to be working as a means of resolving cases that
do actual harm; financial and medical firms need to pay the most attention. Good
data handling practice is the best way of avoiding being sued, and providing
free credit reporting to victims is the best way of mitigating damages when
things do go wrong.
- Software Security
Economics: Theory, in Practice by Stephan Neuhaus and Bernhard Plattner
criticises the standard vulnerability models. They analysed 292 Mozilla
vulnerabilities, 66 in Apache httpd and 21 in Apache Tomcat, using a one-year
moving average, and found a better fit from a model with a reservoir of bugs and
different processes that add and remove them.
- The Plight of the Targeted Attacker in a World of Scale by Cormac Herley asks: Where are the
missing attacks? We have 2 billion Internet users, most of whom care
diddley-squat about security, and lots of sophisticated attacks – yet
life goes on. Why? Well, mass automated attacks are scaleable, with sublinear
costs, while targeted attacks involve people and aren't. The profit from the
former is driven to zero by price competition, and ditto the value of assets
successfully attacked. The latter needs high-value targets, and assets tend to
have a power-law distribution: 1.8% of people have above average wealth, while
with fame 2% are above average. So 98% of people are of no interest. The moral
is that optimal security investment depends on whether anyone's targeting you.
His paper Where
Do All the Attacks Go? develops this further with a more careful model of
large-scale attacks: attackers compete for the most vulnerable users, who get
attacked multiple times, while in this 'bear-race' model, many slightly less
vulnerable users escape harm.
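Herley's 1.8% figure can be reproduced under a simple Pareto assumption (my illustration, not necessarily his model): for a Pareto distribution with shape α > 1, the fraction of the population above the mean is ((α−1)/α)^α, which falls towards zero as α approaches 1.

```python
def pareto_frac_above_mean(alpha):
    """Fraction of a Pareto(alpha) population whose value exceeds the mean.
    The mean is alpha*x_m/(alpha-1) and P(X > t) = (x_m/t)**alpha, so the
    fraction above the mean is ((alpha-1)/alpha)**alpha (requires alpha > 1)."""
    return ((alpha - 1) / alpha) ** alpha

pareto_frac_above_mean(2.0)   # 0.25: a quarter are above average
pareto_frac_above_mean(1.02)  # ~0.018: only ~1.8% are above average
```

So a heavily skewed wealth distribution (α just above 1) leaves 98% of the population below average, and hence of little interest to a targeted attacker.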
- In Sex,
Lies and Cyber-crime Surveys, Dinei Florencio and Cormac Herley point out
that surveys of cyber-crime suffer from the same statistical difficulty as
surveys of sexual behaviour: the ratio between the mean and the median is high.
Men claim to have between 2.5 and 9 times as many partners as women, but if you
consider only men claiming 20 or fewer partners, the difference disappears;
it's an artefact of exaggerated and often rounded claims by a minority of men.
- In An
Empirical Analysis of Exploitation Attempts based on Vulnerabilities in Open
Source Software, Sam Ransbotham studies 883 vulnerabilities in 13101
software products, and finds that open source does have a significant positive
effect on the probability and volume of exploitation; exploits of open source
also spread farther and faster. His paper on The
Impact of Immediate Disclosure on Attack Diffusion and Volume looked at 400+
million alerts from 2006-7 from 960 clients that were matched to the National
Vulnerability Database and 1201 specific vulnerabilities; it found that
immediate disclosure of a vulnerability increases the risk that it will be
exploited in at least one attack, and the number of targets attacked, but,
curiously, is associated with a small drop in total attack volume.
- The Mathematics of Obscurity: On the Trustworthiness of Open Source by Hermann
Haertig, Claude-Joachim Hamann and Michael Roitzsch models races where
defenders have to find all bugs before the attackers find one of them, and
presents a closed form solution. It turns out that on some realistic
assumptions, the attackers always have a window.
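Their closed form isn't reproduced here, but a crude Monte Carlo version of such a race (all rates below are made-up assumptions, not parameters from the paper) illustrates the attacker's window: the attacker needs only one hit before the last bug is fixed.

```python
import random

def attacker_win_prob(n_bugs=20, p_fix=0.2, p_hit=0.02,
                      rounds=200, trials=10000, seed=42):
    """Simulate a bug-finding race: each round the attacker probes every
    still-unfixed bug with probability p_hit, then defenders fix each
    remaining bug with probability p_fix. Returns the fraction of races
    in which the attacker finds an unfixed bug first."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        unfixed = n_bugs
        for _ in range(rounds):
            if unfixed == 0:
                break  # defenders cleared every bug first
            if any(rng.random() < p_hit for _ in range(unfixed)):
                wins += 1
                break  # attacker found a live bug
            unfixed = sum(rng.random() >= p_fix for _ in range(unfixed))
    return wins / trials
```

Even with defenders finding each bug ten times as fast as the attacker does (0.2 versus 0.02 per round), the attacker wins most races here, which matches the paper's qualitative conclusion.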
- In Is finding security
holes a good idea?, Eric Rescorla argues that since large software products
such as Windows contain many security bugs, the removal of an individual bug
makes little difference to the likelihood that an attacker will find another
one later. But many exploits are based on vulnerability information disclosed
explicitly by researchers, or implicitly when manufacturers ship patches. He
therefore argues that, unless discovered vulnerabilities are correlated, it is
best to avoid vulnerability disclosure and minimise patching.
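Rescorla's point is easy to see in a toy independent-bugs model (the bug count and discovery probability below are arbitrary assumptions): with many bugs, removing one barely moves the attacker's overall success probability.

```python
def attack_prob(n_bugs, p_find):
    """Probability an attacker independently discovers at least one of
    n_bugs, each found with probability p_find."""
    return 1 - (1 - p_find) ** n_bugs

before = attack_prob(1000, 0.001)  # ~0.632
after = attack_prob(999, 0.001)    # ~0.632: one fix changes almost nothing
```

The disclosure that accompanies the fix, by contrast, can raise the attacker's success probability sharply, which is the asymmetry his argument turns on.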
- In Optimal Policy for Software
Vulnerability Disclosure, Ashish Arora, Rahul Telang and Hao Xu
argue to the contrary. They produce a model in which neither instant
disclosure nor non-disclosure is optimal; without disclosure, software
firms will have little incentive to fix bugs in later versions of
their products. Their model is based on a representative vulnerability
rather than on vulnerability statistics.
- In Impact of
Vulnerability Disclosure and Patch Availability – An Empirical
Analysis, Ashish Arora, Ramayya Krishnan, Anand Nandkumar, Rahul Telang,
and Yubao Yang present empirical data to support the model of the above
paper. While vendors respond quickly to rapid disclosure, disclosure does
increase the number of attacks; and the number of reported vulnerabilities does
decline over time. They also find that open source projects patch more quickly
than proprietary vendors, and large companies patch more quickly than small
ones.
- In Network
Security: Vulnerabilities and Disclosure Policy, Jay Pil Choi, Chaim
Fershtman and Neil Gandal model the conditions under which a company would
voluntarily disclose vulnerabilities in the absence of regulation, and in which
a mandatory disclosure policy might not necessarily be welfare improving: it
all depends on the proportion of customers who install updates.
- Timing the Application of Security Patches for Optimal Uptime by Steve Beattie,
Seth Arnold, Crispin Cowan, Perry Wagle, and Chris Wright, provides a
quantitative analysis of a practical security management problem – how
long should you wait before you apply a security patch? Pioneers end up
discovering problems with patches that cause their systems to break, but
laggards are more vulnerable to attack. In a typical case, a wait of between
ten and thirty days seems about right.
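The trade-off can be sketched as a cost-minimisation problem; the costs, decay half-life and attack rate below are invented for illustration, not taken from the paper.

```python
def expected_cost(wait_days, c_bad_patch=50.0, p_bad_now=0.1,
                  halflife=10.0, daily_attack_cost=0.05):
    """Expected cost of applying a patch after wait_days: the chance the
    patch is still faulty decays with a half-life as early adopters shake
    out its problems, while exposure to attack grows linearly with the wait."""
    p_still_bad = p_bad_now * 0.5 ** (wait_days / halflife)
    return c_bad_patch * p_still_bad + daily_attack_cost * wait_days

best = min(range(101), key=expected_cost)  # minimised around four weeks
```

With these made-up numbers the optimum lands in the ten-to-thirty-day window the authors report; different cost ratios shift it, which is why the right wait is an empirical question.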
- Economics of Security
Patch Management, by Huseyin Cavusoglu, Hasan Cavusoglu and Jun Zhang,
compares liability and cost-sharing as mechanisms for incentivising vendors to
work harder at patching their software. It turns out that liability helps where
vendors release less often than optimal, while cost-sharing helps where they
release more often. If you want to achieve better coordination at minimum
additional cost to the vendor, they should not be used together. Meanwhile, Competitive and Strategic
Effects in the Timing of Patch Release by Ashish Arora, Christopher Forman,
Anand Nandkumar and Rahul Telang shows that competition hastens patch release
even more than disclosure threat in two out of three studied strategies.
- Open and
Closed Systems are Equivalent (that is, in an ideal world) is a paper by
Ross Anderson that examines whether openness helps the attacker or the defender
more. He shows that under standard assumptions used in reliability growth
models, openness helps both equally. There remain many factors that can break
symmetry and cause one or the other to be better in practice, but one should
look for them in the ways a system departs from the standard assumptions.
- In Pricing
security: vulnerabilities as externalities, Camp and Wolfram argue that
exploits are externalities, and that a market of vulnerabilities can increase
public welfare. Stuart Schechter's paper How to Buy Better
Testing: using competition to get the most security and robustness for
your dollar expands on this, and his thesis, Computer Security
Strength and Risk: A Quantitative Approach, develops this theme
in a lot more detail.
- In Bug Auctions: Vulnerability
Markets Reconsidered, Andy Ozment applies auction theory to analyse how
vulnerability markets might be run better, and how they might be exploited by
the unscrupulous. Then Michael Sutton and Frank Nagle's paper, Emerging Economic Models for
Vulnerability Research, describes the operation of iDefense and Tipping
Point, two companies set up to purchase vulnerabilities on the market. Vulnerability
markets by Rainer Boehme provides a short survey of the whole field.
- In The legitimate
vulnerability market: the secretive world of 0-day exploit sales, Charles
Miller describes real-world experience in selling vulnerabilities, demonstrating
the perils of operating in the absence of mature vulnerability markets.
Relevant Theory Papers
- Games in Online Advertising: Can Ads Help Secure the Web? by Nevena
Vratonjic, Jean-Pierre Hubaux, Maxim Raya and David Parkes presents a
game-theoretic model of what happens when ISPs meddle with advertising, by
screening subscriber traffic, selling data to ad companies, or even inserting
ads into web pages on the fly. Having constructed a multi-stage game between a
website and all ISPs, they used real data on web traffic to estimate
incentives. An ISP could make money out of the top 1000 websites, but the more
aggressive ISPs get, the more websites would be secured. In fact, if ISPs try
to divert revenue from the most popular sites, these would be secured rapidly.
- In System
Reliability and Free Riding, Hal Varian discusses ways in which the defence
of a system can depend on the efforts of the defenders. Programming, for
example, might be down to the weakest link (the most careless programmer
introducing the fatal vulnerability) while the effectiveness of testing might
depend on the sum of everyone's efforts. There can also be cases where the
security depends on the efforts of an individual champion. These different
models have interesting effects on whether an appropriate level of defence can
be provided, and what policy measures are advisable.
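Varian's three aggregation cases can be made concrete with a toy sketch; the effort levels below are invented for illustration, not taken from the paper:

```python
# Toy illustration of Varian's three "security production functions":
# the same individual efforts yield very different system-level security
# depending on how they aggregate.
efforts = [0.9, 0.5, 0.2]  # three defenders, one of them careless

weakest_link = min(efforts)                 # e.g. the most careless programmer
total_effort = sum(efforts) / len(efforts)  # e.g. testing, where efforts add up
best_shot = max(efforts)                    # e.g. a single security champion

print(weakest_link, total_effort, best_shot)
```

Under the weakest-link model one careless defender drags the whole system down, which is where free-riding bites hardest.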
- Investment (Failures) in Five Economic Environments: A Comparison of
Homogeneous and Heterogeneous User Agents, by Jens Grossklags, Nicolas
Christin and John Chuang, extends this analysis to account for heterogeneity
among decision makers, for example with respect to size and frequency of loss.
- The economics of
information security investment, by Larry Gordon and Marty Loeb, suggests
that a firm may often prefer to protect those information sets with middling
vulnerability, rather than the most vulnerable (as that may be too expensive);
and that to maximise the expected benefit, a firm might only spend a small
fraction of the expected loss.
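The flavour of the result can be reproduced with a toy calculation; the breach-probability function and every parameter below are illustrative assumptions, not figures from the paper:

```python
# Minimal sketch of the Gordon-Loeb investment model. S(z) is the
# probability of a breach after investing z, for an information set with
# baseline vulnerability v and potential loss L. All numbers invented.
v, L = 0.65, 1_000_000    # hypothetical vulnerability and loss
alpha, beta = 1e-5, 1.0   # hypothetical security-productivity parameters

def S(z):
    return v / (alpha * z + 1) ** beta

def net_benefit(z):
    return (v - S(z)) * L - z  # reduction in expected loss, minus the spend

# crude grid search for the profit-maximising investment z*
z_star = max(range(0, 1_000_000, 100), key=net_benefit)
print(z_star, z_star / (v * L))  # z* is a small fraction of expected loss v*L
```

With these assumed parameters the optimal spend comes out well below the Gordon-Loeb bound of 1/e times the expected loss.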
- On the Evolution of
Attitudes toward Risk in Winner-Take-All Games by Eddie Dekel and Suzanne
Scotchmer presents an evolutionary model of how winner-take-all conflicts such
as patent races (or for that matter battles for control of software standards)
select for risk-takers and lead to the extinction of risk-avoiders.
- A BGP-based Mechanism
for Lowest-Cost Routing, by Joan Feigenbaum, Christos Papadimitriou, Rahul
Sami and Scott Shenker, shows how combinatorial auction techniques can be used
(at least in theory) to provide distributed routing mechanisms that are proof
against strategic behaviour by one or more of the participants.
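The underlying VCG idea can be shown in miniature with a small, centralised sketch (the paper's contribution is doing this in a distributed, BGP-compatible way; the graph and costs below are invented):

```python
import heapq

# Toy VCG payments for lowest-cost routing: each link declares a cost;
# links on the cheapest path are paid their declared cost plus the harm
# their absence would cause, which makes truthful declaration a dominant
# strategy. Centralised sketch only, not the paper's distributed scheme.
def shortest(graph, src, dst, banned=None):
    """Dijkstra; graph maps node -> {neighbour: cost}. Returns (cost, path)."""
    pq, seen = [(0, src, [src])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, c in graph[node].items():
            if banned and (node, nxt) in banned:
                continue
            if nxt not in seen:
                heapq.heappush(pq, (cost + c, nxt, path + [nxt]))
    return float("inf"), None

def vcg_payments(graph, src, dst):
    cost, path = shortest(graph, src, dst)
    payments = {}
    for u, v in zip(path, path[1:]):
        alt_cost, _ = shortest(graph, src, dst, banned={(u, v)})
        # declared cost plus the marginal value of this link's presence
        payments[(u, v)] = graph[u][v] + (alt_cost - cost)
    return payments

g = {"a": {"b": 1, "c": 4}, "b": {"d": 2}, "c": {"d": 1}, "d": {}}
print(vcg_payments(g, "a", "d"))  # each link on the path is paid above cost
```

In this toy graph the cheapest path a-b-d costs 3 and the alternative costs 5, so each link on the winning path is paid a premium of 2 over its declared cost.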
- Lawrence Ausubel's Ascending
Auctions with Package Bidding shows that certain types of combinatorial
auction can be solved efficiently if bidding is conducted through a trusted
proxy – a system that can be relied on to bid according to an agreed strategy.
- The Communication
Complexity of Efficient Allocation Problems, by Noam Nisan and Ilya Segal,
shows that although one can solve the allocation problem using strategy-proof
mechanisms, the number of bits that must be communicated grows exponentially;
thus in many cases the best practical mechanism will be a simple bundled
auction. The paper also suggests that if arbitrary valuations are allowed,
players can submit bids that will cause communications complexity problems for
all but the smallest auctions.
- Noam Nisan and Amir Ronen's seminal paper Algorithmic Mechanism
Design shows how distributed mechanisms can be designed that are
strategyproof, that is, participants cannot hope to gain an advantage by
cheating. This paper sparked off much recent research at the boundary between
theoretical computer science and economics.
- Two influential related papers by Geoffrey Heal and Howard Kunreuther on
security externalities extended ideas from information security economics to
much more general applications. Interdependent
Security discusses the many cases where my security depends on my
neighbour's – where worms can spread from one part of a company to
another, fire from one apartment to another, and infection from one person to
another. In some cases there will be a temptation to free-ride off the efforts
of others, so it is hard to make security investment a dominant strategy. You
Can Only Die Once: Managing Discrete Interdependent Risks examines the more
general case and analyses the conditions under which various security problems
have equilibria that are not socially optimal.
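The coordination failure can be seen in a toy two-firm version of the interdependent-security game; all the numbers below are invented for illustration:

```python
# Two-firm sketch of Heal and Kunreuther's interdependent-security game.
# Each firm either invests in protection at cost c or not; an unprotected
# firm is hit directly with probability p, and an unprotected neighbour
# can pass the loss on with probability q. Numbers are illustrative.
c, L = 35, 100
p, q = 0.5, 0.4

def expected_cost(me_invests, other_invests):
    spend = c if me_invests else 0
    p_direct = 0 if me_invests else p
    # contagion from an unprotected neighbour, if I wasn't hit directly
    p_contagion = (1 - p_direct) * (q if not other_invests else 0)
    return spend + (p_direct + p_contagion) * L

# Investing is a best response to investing, and not-investing a best
# response to not-investing, so investment is not a dominant strategy:
assert expected_cost(True, True) < expected_cost(False, True)    # 35 < 50
assert expected_cost(False, False) < expected_cost(True, False)  # 70 < 75
```

With these assumed numbers the game has two equilibria, "both invest" and "neither invests", and the second is socially inferior, which is the free-riding trap the papers analyse.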
Measuring Electronic Crime
- Measuring the Cost of Cybercrime by Ross Anderson and colleagues tries to collate
defensible numbers, for the UK and the world, for the direct and indirect costs
of a number of types of cybercrime. It distinguishes between ‘genuine’
cybercrime like fake AV and stranded traveler scams; traditional crime that’s
now done online such as tax filing fraud; and ‘transitional’ offences such as
card fraud that existed before but whose modus operandi has changed radically.
The paper found that the indirect costs greatly exceed the direct costs for
genuine cybercrime, while for traditional crime the direct costs are higher and
for transitional crime they're about equal.
- Cyber crime: a review of the evidence by Mike McGuire and Samantha Dowling is both
a literature review and a report of a victimisation study carried out for the
UK Home Office. In addition to the new material, it collates data from a series
of other surveys done in both the public and private sectors to get a view of
the real scale of technical online offences, fraud and theft, and sexual
offences against children. While online annoyance is common, with over 30% of
users reporting spam or malware in the last 12 months, actual losses are rarer,
with only 3% reporting being out of pocket as a result of a scam in that period.
- Analysis of ecrime in Crowd-sourced Labor Markets by Vaibhav Garg, Chris Kanich and
Jean Camp examines why PCs in some countries are more likely to be recruited
into botnets, and how legitimate and criminal businesses affect each other by
looking at the uptake of jobs on Mturk (which are mostly legitimate) versus
Freelancer (where half the jobs aren't). Widespread broadband and
English-language competence help Mturk, while weak law enforcement boosts the
uptake of jobs on Freelancer.
- Contagion in
Cybersecurity Attacks by Julian Williams and colleagues models the mutual
excitation of vectors of attacks using a variant of the Aït-Sahalia model,
where a matrix links variables in a stochastic process and can cope with jumps
in variables’ values, on the assumption that their arrival can be modelled as a
Poisson process. They fitted this to multi-channel Dshield attack data from
2003-9 and found that while long-run correlations are low, the short-term
correlations are high; attacks are bunched.
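The bunching effect can be illustrated with a minimal self-exciting simulation, a crude discrete-time stand-in for the Aït-Sahalia-style model with invented parameters:

```python
import math, random

# Minimal "Hawkes-style" self-exciting process: each attack temporarily
# raises the arrival rate of further attacks, so events bunch in the
# short run even though long-run rates are modest. Parameters invented.
random.seed(1)
mu, alpha, decay = 0.02, 0.3, 0.5   # base rate, excitation jump, decay rate
excitation = 0.0
events = []
for t in range(2000):
    rate = mu + excitation
    if random.random() < min(rate, 1.0):
        events.append(t)
        excitation += alpha          # each event excites future arrivals
    excitation *= math.exp(-decay)   # and the excitation decays quickly

gaps = [b - a for a, b in zip(events, events[1:])]
short = sum(1 for g in gaps if g <= 2)
print(len(events), short / len(gaps))
```

A pure Poisson process at the same base rate would almost never produce back-to-back events; here a large share of inter-event gaps are short, which is the clustering the paper reports.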
- Is the Internet for Porn? An Insight Into the Online Adult Industry by Gilbert
Wondracek and colleagues investigates the links between adult pay sites, link
collections, traffic brokers, search engines and redirector services. They
crawled 269,000 URLs from 35,000 domains, which they checked for malware (over
3%, or about five times as much as expected) and other standard exploits. The
authors also set up two porn websites and joined three traffic brokers, where
$160 bought 49,000 visitors, of whom more than 20,000 were exposed to at
least one known exploit. By comparison, pay-per-install sites charge $130
per 1000 US installs. The conclusion: although not all porn sites are crooked,
many are, and this paper describes a whole ecosystem of shady services.
- The Underground Economy
of Spam by Brett Stone-Gross and colleagues draws conclusions from the
databases found on seized Cutwail botnet command and control servers. They
conclude that the gang made $2-4m between June 2009 and August 2010.
- The underground economy: priceless by Rob Thomas and Jerry Martin of Team Cymru
was the first paper to explore the underground economy from studying it
directly by monitoring IRC chat rooms. In recent years online criminals have
established an efficient division of labour, just like in Adam Smith's pin
factory. This paper explains how the villains' pin factory works.
- In An
Inquiry into the Nature and Causes of the Wealth of Internet Miscreants,
Jason Franklin, Vern Paxson, Adrian Perrig, and Stefan Savage provide a more
systematic analysis of the underground economy by studying IRC channels and
collecting a lot of data about online criminals' trade in social security
numbers, credit card numbers and other goodies.
- Identity Theft by Keith Anderson, Erik Durbin and Michael Salinger provides a survey
of the research literature relating to identity theft in the USA and presents
data from FTC surveys. Almost 4% of Americans were victims in 2005, with 0.8%
suffering the most serious forms such as having credit cards issued to others
in their names. Wealthier Americans were most at risk, and a significant
minority incurred nontrivial clean-up costs. Credit card fraud losses by banks
are stable at about 6 basis points — down from over 15 in 1992; losses by
merchants are much higher at 1.4% of turnover (down from 3.6% in 2000). The
authors also discuss various policy options such as liability shifting, breach
notification, credit freezes and stiff penalties. These are made complex by the
links between the economics of payment systems and of credit.
- The Impact
of Incentives on Notice and Take-down by Tyler Moore and Richard Clayton
compares a variety of notice and take-down regimes for removing content on the
Internet. They find that phishing is removed fastest, but the banks are
much slower to remove mule-recruitment websites. It turns out that child sexual
abuse images are slowest of all to be removed, due to the division of
responsibility for removal along national lines.
- In Examining the
Impact of Website Take-down on Phishing, Tyler Moore and Richard Clayton
find wide variation in the effectiveness of the responses of different actors to
phishing, and empirically demonstrate the impact of attacker innovation in the
form of longer website lifetimes for rock-phish and fast-flux attacks.
- Studying Malicious Websites and the Underground Economy on the Chinese Web by Jianwei
Zhuge and colleagues presents data on the Chinese underground economy and
explains the different roles of miscreants there.
- In Crime Online: Cybercrime
and illegal innovation, Howard Rush, Chris Smith, Erika Kraemer-Mbula and
Puay Tang describe the specialisation that has accelerated the development of
online crime since about 2004, just as Adam Smith's pin factory epitomised the
same tendency in the late 18th century.
Information Security Regulation
- A study
of security economics in Europe, by Ross Anderson, Richard
Clayton, Tyler Moore and Rainer Boehme, was published by the European Network
and Information Security Agency. It applies security economics research to
synthesise a series of policy options for dealing with cyber risk and online
policy issues in Europe. A shorter
version (62 pages) appeared at WEIS 2008, and there's an even shorter (25
page) version entitled Security economics
and European Policy.
- The Role of Internet Service Providers in Botnet Mitigation: An Empirical
Analysis Based on Spam Data by Michel van Eeten and colleagues analysed 63
billion spam messages from 138 unique sources, and confirmed that ISPs are the
critical control
point: the top 200 ISPs controlled 60% of spam sources and the top 10,
30%. There are substantial differences in ISP effectiveness at botnet
mitigation – two orders of magnitude. They explored possible explanatory
factors and found that cable providers have less infection, as do ISPs in
countries that adhere to the London Action Plan. Education, regulation and
automation may explain some of the differences.
- Modeling Internet-Scale Policies
for Cleaning up Malware by Steven Hofmeyr, Tyler Moore, Stephanie Forrest,
Benjamin Edwards, and George Stelle, builds on this and shows that if the top
0.2% of ASes were to clean up malware, this would have more effect than if 30%
of them were chosen at random. They build a model of selective blocking and
cleanup, which they validate against the maliciousnetworks.org dataset. However
the efficacy of a selective blocking strategy would fall off rapidly if fewer
than three-quarters of the top 0.2% were to cooperate.
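The intuition, that targeting a heavy-tailed distribution beats random selection, can be sketched as follows; the Pareto distribution and all numbers are invented, not fitted to the paper's dataset:

```python
import random

# Back-of-envelope sketch of why targeted cleanup beats random cleanup
# when maliciousness is heavily concentrated. Give each of 10,000 ASes
# a Pareto-distributed share of malware, then compare two policies.
random.seed(7)
malware = sorted((random.paretovariate(1.1) for _ in range(10_000)), reverse=True)
total = sum(malware)

top = malware[:20]                    # clean up the top 0.2% of ASes
rand = random.sample(malware, 3_000)  # clean up a random 30% of ASes

print(sum(top) / total, sum(rand) / total)
```

Because the tail is so heavy, the twenty worst ASes account for more malware than three thousand chosen at random, mirroring the paper's finding.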
- In Assessing
Home Internet Users' Demand for Security: Will They Pay ISPs? Dallas Wood
and Brent Rowe used discrete-choice experiments to assess whether home users
might pay an ISP more for a secure service. Home users want ‘lower taxes
and more services’ – low fees, no compliance and no risk. In
explicit trade-offs, they will pay about $4 per month to avoid crashes or
disconnections, and over $6 to avoid identity theft. They'll spend only $2.94
to avoid harm to others, but just 73c to avoid spending an hour on security
compliance.
- Might Governments Clean-up Malware? is a paper by Richard Clayton assessing how
much a contractor might charge per PC if given a government ‘public
health contract’ to clean up infected PCs; this might be about $10 per
cleanup, or under a dollar a machine per year assuming 0.5% of the population
use the service every month.
- In Data
Breaches and Identity Theft: When is Mandatory Disclosure Optimal?, Sasha
Romanosky, Richard Sharp and Alessandro Acquisti analyse the net change in
social costs following breach-disclosure laws, using a tort model of minimising
the cost of care plus the cost of a breach. The social cost converges on the
firm's optimal cost as the level of care increases, and disclosure seems more
appropriate than mandated standards.
- Do Data Breach
Disclosure Laws Reduce Identity Theft? by Sasha Romanosky, Rahul Telang and
Alessandro Acquisti studies the effects of the security breach disclosure laws
now in force in many US states, and concludes that the case for their
effectiveness has not been proven.
- In Reinterpreting the
Disclosure Debate for Web Infections, Oliver Day, Rachel Greenstadt and
Brandon Palmen provide another analysis of data on electronic crime, in this
case the distribution of malware on infected web hosts. They show a high
concentration of infected hosts at poor-performing ISPs, and find evidence of
attackers moving to previously untargeted ISPs as others clean up their act.
- Why the Security
Market has Not Worked Well is a chapter from a 1990 study by the NAS
Computer Science and Technology Board which provides an early analysis of the
`computer security problem'. It blames the rapid pace of technological (and
particularly architectural) change, the comparatively slow pace of government
market interventions (through procurement and evaluation programs), export
controls, a lack of consumer understanding of the risks, and the very limited
recourse that US customers have against vendors of faulty software.
- Improving Information Flow in the Information Security Market describes the efforts
of the US government over the last couple of decades to tackle a perceived
market failure in the security business – the lemons problem, whereby bad
products drove out good ones. The attempted fix was a government-sponsored
evaluation scheme (the Orange Book), but that was not without its own problems.
- In The Economic
Impact of Regulatory Information Disclosure on Information Security
Investments, Competition, and Social Welfare, Anindya Ghose and Uday Rajan
discuss how the implementation of US legislation such as Sarbanes-Oxley,
Gramm-Leach-Bliley and HIPAA has placed a disproportionate burden on small and
medium sized businesses, largely through a one-model-fits-all approach to
compliance by the big accounting firms. They show how mandatory investment in
security compliance can have a number of unintended consequences including
distorting security markets and reducing competition.
- In The Potential
for Underinvestment in Internet Security: Implications for Regulatory
Policy, Alfredo Garcia and Barry Horowitz show that the gap between the
social value of ISPs, and the revenue at stake associated with their security
levels, is continuing to increase. If this continues, they argue, mandatory
security standards may become likely.
- The European Union has proposed a Network
Security Policy that sets out a common European response to
attacks on information systems. This starts using economic arguments
about market failure to justify government action in this sector. The
proposed solutions are rather familiar, involving everything from
consciousness raising to Common Criteria evaluations; but the use of
economic analysis could be significant for the future.
- The Center for Strategic and International Studies has a very good study of the risks of
cyber-terrorism which goes a long way to debunk the scaremongering and hype
about the vulnerability of critical infrastructures to digital attack.
- The Brookings Institution has published Pirates
of the ISPs: Tactics for Turning Online Crooks Into International Pariahs
suggesting that the USA start establishing cooperation with China on cybercrime
as a means of starting to build cooperation and international norms online; the
idea is that cyber-criminality should become taboo as an instrument of state
power, just as piracy at sea did in the nineteenth century.
- Back in 2002, the Institute published a short but very influential paper
on the economic effects of security interdependency.
- Economics and Security in Statecraft and Scholarship explains why a web search on
`economics' and `security' turns up few interesting documents on international
affairs. The two were considered closely linked until 1945; thereafter nuclear
weapons were thought to decouple national survival from economic power, while
the USA established a pattern of confronting the USSR over security, and Japan
and the EU over trade. This caused Washington bureaucrats to split into a
`security' camp and a `political economy' camp; academics studying
international relations followed suit. Bill Clinton started to get the
bureaucrats working together again from about 1995, but the academics remain
divided.
Copyright and Rights Management
- Ad-blocking Games by Nevena Vratonjic and colleagues models games in which websites bar
users who block ads (as Ars Technica did in 2010); the equilibria generally
depend on how much the users value the content, and to a lesser extent the cost
of blocking detection and the extent to which users hate ads. This framework
can help publishers maximise revenue, but it means profiling users more
closely, which may be bad for privacy.
- Matthew Hashim, Sandra Maximiano and Karthik Kannan's Information
Targeting and Coordination: An Experimental Study asks whether messages on
copyright infringement might be counterproductive, like those on teen drinking
which send the message that “everybody does it”. Experiments on
whether information feedback affected free riding found that random feedback
(as used now) had no effect, while targeted treatments resulted in
significantly higher user contributions.
- Paradox Revisited: An Empirical Analysis of File-Sharing Behaviour in P2P
Communities finds a positive correlation between the size of a BitTorrent
file-sharing community and the amount of content shared, despite a reduced
individual propensity to share in larger groups, and deduces from this that
file-sharing communities provide a pure (non-rival) public good. Forcing users
to upload results in a smaller catalogue; but private networks provide both
more and better content, as do networks aimed at specialised communities.
- Felix Oberholzer-Gee and Koleman Strumpf's File-Sharing and Copyright
argues that file-sharing doesn't seem to harm overall social welfare in that
concert ticket sales have gone up by more than sales of recorded music have
fallen. The paper summarises much of the research and controversy kicked off by
an earlier paper of theirs, The Effect of
File Sharing on Record Sales -- An Empirical Analysis. That argued that
downloads do not do significant harm to the music industry: five thousand
downloads are needed to displace a single album sale, while high-selling albums
actually benefit from file sharing. A similar
paper by the Dutch government concluded that file sharing led to net welfare
gains; for every Euro the music industry lost, consumers gained two.
- A Cost
Analysis of Windows Vista Content Protection asks some hard questions about
whether the new security mechanisms in Vista are worth it, and to whom. It
suggests Microsoft is imposing large costs on hardware suppliers, under cover
of protecting Hollywood content, but in reality as a lock-in play.
- It follows logically from the `Trusted Computing'
Frequently Asked Questions, which provided the first critical survey of the
technology, and from Cryptography and
Competition Policy – Issues with `Trusted Computing' which developed
an economic analysis that first suggested that Microsoft stood to gain much
more than Hollywood – with the quick win being to lock in users of
Microsoft Office more tightly, thus enabling its price to be raised (or cut
less) in the face of competition.
- Fetscherin and Vlietstra's DRM and music: How do rights affect the
download price? shows that the prices
of music tracks sold online are mostly determined by the rights granted to the
purchaser – including the right to burn, copy or export the music –
and also by the label and the location.
- Ivan Png's Copyright:
A Plea for Empirical Research attacks Oberholzer and Strumpf, citing six
other studies that did indeed show a negative correlation between downloads and
CD sales. It also examines the Eldred case and looks at the incentive effects
of copyright law on the production of movies.
- Yooki Park and Suzanne Scotchmer's Digital Rights
Management and the Pricing of Digital Products argues that DRM does not
have to be perfect – the cost of circumvention needn't be raised above
the monopoly price; that technical protection may still yield more revenue than
legal protection, as it may never expire; and that separate DRM systems may
yield higher prices than a shared system, because of the greater incentives
for, and effects of, circumvention. It also looks at how the structure of a DRM
consortium such as the TCG might promote, or inhibit, collusive behaviour among its members.
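The pricing logic can be stated in a couple of lines; the numbers below are invented for illustration:

```python
# Toy rendering of Park and Scotchmer's pricing observation: the seller
# can charge at most the LESSER of the monopoly price and the consumer's
# cost of circumventing the DRM, so protection need only be "good
# enough", not perfect. Numbers are illustrative.
def constrained_price(monopoly_price, circumvention_cost):
    return min(monopoly_price, circumvention_cost)

assert constrained_price(10, 50) == 10  # strong DRM: monopoly price prevails
assert constrained_price(10, 6) == 6    # weak DRM caps the price below it
```

The point is that raising circumvention cost above the monopoly price buys the seller nothing further, which is why DRM does not have to be unbreakable to do its economic job.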
- Hal Varian's New
Chips Can Keep a Tight Rein on Consumers provides a concise introduction to
the problems that strict usage control mechanisms create for innovation
policy. A certain level of reverse engineering for compatibility is an
important brake on the abuse of monopoly power, especially in information goods
and services markets whose incumbents try hard to manipulate switching costs.
- In Cruel, Mean or
Lavish?: Economic Analysis, Price Discrimination and Digital Intellectual
Property Jamie Boyle argues that the next target of the copyright lobby,
after cracking down on fair use, will logically be the doctrine of first sale:
the right to resell, lend, or even criticise a book (or film or software
product) will be increasingly limited by contract and by technical
means. Publishers may try to control their aftermarkets using arguments about
the economics of price discrimination.
- In The Law and
Economics of Reverse Engineering, Pam Samuelson and Suzanne Scotchmer
describe what may go wrong if some combination of technical and legal
restraints can be made to undermine the right to reverse engineer software
products so as to make other products compatible with them. It provides the
theoretical and scholarly underpinnings for much of the work on the
anti-competitive effects of the DMCA, copyright control mechanisms, and
information security mechanisms applied to accessory control
applications. There is also a shorter
paper that applies the lessons of the main paper to the DeCSS case.
- Open Source Software Projects as User Innovation Networks expands on this. Eric
von Hippel shows how most of the innovations that spur economic growth are not
anticipated by the manufacturers of the platforms on which they are based; the
PC, for example, was conceived as an engine for running spreadsheets. If IBM
had been able to limit it to doing that, a huge opportunity would have been
lost. Furthermore, technological change in the IT goods and services markets is
usually cumulative. If security technology can be abused by incumbent firms to
make life harder for people trying to develop novel uses for their products,
this will create all sorts of traps and perverse incentives.
- In Security
and Lock-In: The Case of the U.S. Cable Industry, Tom Lookabaugh and Doug
Sicker discuss an existing case history of an industry's development being
affected by security-related technical lock-in. US cable industry operators are
locked in to their set-top-box vendors; and although they can largely negotiate
to offset the direct costs of this when committing to a supplier, the indirect
costs are large and unmanageable. In particular, innovation suffers. Cable is
falling behind other platforms, such as the internet, as the two platform
vendors don't individually have the incentives to invest in improving their platforms.
- Trusted Computing, Peer-To-Peer Distribution, and the Economics of Pirated
Entertainment, by Stuart Schechter, Rachel Greenstadt and Mike Smith, shows
how trusted computing technology can aid the pirates as well as the Hollywood
guys. TC platforms will, if they perform as advertised, provide much more
robust platforms for hosting peer-to-peer file-swapping services; they will be
very much less vulnerable to the service denial attacks currently deployed by
the content industry against services such as Gnutella and Grokster.
- In Privacy
Engineering for Digital Rights Management Systems, Joan Feigenbaum, Michael
Freedman, Tomas Sander and Adam Shostack discuss why the economic motivations
of the various players lead to serious difficulties in deploying privacy
technology for DRM.
- The Case
Against Patents by Michele Boldrin and David Levine is the definitive
study of the economic effects of patents. There is no empirical evidence that
they serve to increase innovation and productivity. Innovation emerges in the
highly competitive-cooperative environment of a new industry cluster; while
weak patent systems may mildly increase innovation with limited side-effects,
they inevitably evolve into strong patent systems that retard innovation with
many negative side-effects, favouring old and stagnant industries.
- Measurement of Attitudes Regarding Cybercrime discusses how prosecutors and
public opinion are out of step; the former consider protest crimes to be more
serious than crimes done for financial gain, while voters take the opposite view.
- Forensic Economics is an
emerging field whose researchers use statistical techniques to establish the
existence of hidden wrongdoing in a wide variety of fields such as backdated
executive stock options, racial bias in hiring, and builders skimping on
materials. This survey discusses over 100 different studies.
- Why do
Nigerian Scammers Say They are from Nigeria? Cormac Herley's answer is that
repelling false positives is more important than finding true ones, and the
initial email costs almost nothing. So the function of the word “Nigeria” is to
find people who’ve not been sensitised to the scam.
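Herley's arithmetic can be sketched with invented numbers: sending email is free, but working a reply costs the scammer time, so precision matters more than reach:

```python
# Toy version of Herley's argument. A blatant "Nigeria" pitch repels
# sceptics, so a larger share of those who DO reply can actually be
# landed, and the scammer wastes less effort on dead ends. All numbers
# are invented for illustration.
def profit(replies, conversion, gain=1000, cost_per_reply=20):
    return replies * (conversion * gain - cost_per_reply)

plausible = profit(replies=1000, conversion=0.01)  # many replies, few victims
blatant = profit(replies=50, conversion=0.25)      # few replies, mostly victims
print(plausible, blatant)
```

With these assumed figures the plausible pitch loses money on false positives while the blatant one profits, which is exactly why the scammers self-identify.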
- Who Sometimes Violates the Rule of the Organizations? by Toshihiko Takemura and
Ayako Komatsu found compliance in Japanese companies with many infosec policies
was low; they asked respondents whether their motivation was cash, pleasure,
moral behaviour, peer acceptance or self-realisation and the last two scored
highest. Myopic behaviour is common, and even staff who experience incidents
violate the rules, especially where they are comfortable with the risks
personally, but they do it less if they value their peers’ opinion of them.
- The Effect of
Fraud Investigation Cost on Pay-Per-Click Advertising, by Min Chen and
colleagues, models the possible effect of improved audit on click fraud, which
has been estimated at 14.5% in Q3 2009 rising to 22.3% in Q3 2010. Their model
suggests that it depends on the extra cost to the service provider of
high-quality detection – if it’s low then both parties might have an
incentive to pay for audit and then to improve their technology.
- Security and privacy implications of consumer payment innovation discusses
what threats to competition, privacy and payment security might arise as a
result of current innovation in payment systems. It argues that although fraud may
increase, so will welfare, so there's no reason to panic. For now, bank
supervisors should work on collecting better fraud statistics, so that if
there ever is a crisis the response can be well-informed. (slides)
- Economic Tussles in Federated Identity Management by Tyler Moore and Susan Landau
analyses identity platforms as two-sided markets that lead to four tussles.
First, who gets to collect and control transaction data? OpenID benefits ID
providers and users but not service providers, while Facebook gives service
providers much more. Providers who share the social graph are accepted at
significantly more sites. Second, who sets the authentication rules?
Time-to-market matters more than robustness, and established payment networks
may tolerate more fraud. Third, what happens when things go wrong? Different
error types, sizes and clarity lead to different strategies. The final tussle
is who gains by interoperability: this helps users and identity providers but
not service providers.
- The password thicket: technical and market failures in human authentication
on the web by Joe Bonneau and Soren Preibusch reports a survey of over 150
websites' password advice, length, recovery, probing prevention and guessing
prevention. Password overcollection is a tragedy of the commons; insecurity is
a negative externality, as attackers get passwords from weak sites and try them
elsewhere: Twitter recently forced a million users to reset their passwords
after such a password-reuse attack.
- Terror, Security, and Money: Balancing the Risks, Benefits, and Costs of
Homeland Security by John Mueller and Mark Stewart documents the cost of
overreaction to terrorism since 9/11 at in excess of a trillion dollars: $360
billion more
for homeland security, $110 billion more for intelligence, private-sector
expenditures up by over $100 billion. These now exceed spending on all crime,
and there's also the cost of foreign terror-justified wars. Yet the terrorism
insurance premium on a $303m building is under $10,000 a year.
- Ross Anderson's and Shailendra Fuloria's paper On
the security economics of electricity metering examines problems with the
smart metering initiatives being pursued in the EU and the USA, which create
significant conflicts of interest between energy companies, governments and
customers. They make recommendations for mitigating these.
- The topology of covert conflict by Shishir Nagaraja and Ross Anderson
examines how the police can best target an underground organisation given some
knowledge of its patterns of communication, and how it in turn might react,
using a
framework combining ideas from network analysis and evolutionary game
theory. Nagaraja's The Economics of
Covert Community Detection and Hiding extended this work from active
attacks on networks to passive surveillance. Hyoungshick Kim and Ross Anderson's
Experimental Evaluation of Robustness of Networks studies best strategies
for both surveillance and countersurveillance in networks where the adversaries
have bounded resources. Finally, Wayne Baker and Robert Faulkner's The
Social Organisation of Conspiracy analyses how network topology works out in
real criminal conspiracies.
- In The Economics
of Mass Surveillance, George Danezis and Bettina Wittneben apply these ideas
to show that surveillance directed against just a few well-connected militant
organisers can draw a surprising number of members of a subversive
organisation into the surveillance net.
- Closing the
Phishing Hole – Fraud, Risk and Nonbanks reports research
commissioned by the US Federal Reserve for their
Santa Fe Conference on bank regulation. This paper identified speedy asset
recovery as the most effective deterrent to online fraud; fraud is made easier
by payment systems such as Western Union that make the recovery of stolen funds
more difficult.
- Nonbanks and
Risk in Retail Payments by Stuart Weiner, Richard Sullivan and Simonetta
Rosati followed up with an analysis of the role played by nonbanks in US
payment systems more generally; a very large part of the infrastructure is now
operated by nonbanks.
- In The Economics of
Digital Forensics, Tyler Moore explains how the interests of vendors
diverge from those of law enforcement. For example, mobile phone vendors prefer
proprietary interfaces, which make data recovery from handsets difficult;
recovery tools exist only for the most common models. Criminals should buy
unfashionable phones, while the police should prefer open standards.
- Proof-of-Work Proves Not to Work by Ben Laurie and Richard Clayton shows that the
spam-blocking schemes that rely on getting mail senders to perform some
computational task are unlikely to solve the spam problem: there are many
legitimate senders with less available compute power per message than many
spammers can obtain from the compromised hosts they use.
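Laurie and Clayton's argument is essentially back-of-envelope arithmetic, which can be sketched as follows. The figures below are illustrative assumptions, not numbers from the paper.

```python
# Back-of-envelope comparison following Laurie and Clayton's argument.
# All figures below are illustrative assumptions, not data from the paper.

pow_seconds = 20          # CPU-seconds of proof-of-work demanded per message

# A legitimate small mail server: one CPU, a real daily volume to send.
legit_messages_per_day = 50_000
legit_cpu_seconds = 1 * 24 * 3600
legit_capacity = legit_cpu_seconds / pow_seconds        # stamps/day it can mint

# A spammer with a botnet of compromised hosts: compute is effectively free.
botnet_hosts = 10_000
spam_cpu_seconds = botnet_hosts * 24 * 3600
spam_capacity = spam_cpu_seconds / pow_seconds

print(f"legitimate server can stamp {legit_capacity:,.0f} messages/day")
print(f"botnet can stamp {spam_capacity:,.0f} messages/day")
print("legitimate mail throttled:", legit_capacity < legit_messages_per_day)
```

Under these assumptions the honest server can only stamp 4,320 of its 50,000 daily messages, while the botnet mints over 43 million stamps a day: any work factor high enough to hurt spammers strangles legitimate bulk senders first.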
- In Modelling
Incentives for Email Blocking Strategies, Andrei Serjantov and Richard
Clayton analyse the incentives on ISPs to block traffic from other ISPs with
many infected machines, and back this up with data. They also show how a number
of existing spam-blocking strategies are irrational and counterproductive.
- In Inadvertent
Disclosure – Information Leaks in the Extended Enterprise, Eric
Johnson and Scott Dynes study inadvertent data leaks of sensitive information
(personal and corporate) through P2P file sharing, and also find that some
users are explicitly searching for sensitive documents leaked through such
channels.
- In Mental Models of
Computer Security Risks, Farzaneh Asgharpour, Debin Liu and Jean Camp show
that people's mental models of computer security risks vary substantially
according to their expertise in the subject. The models implicit in much of the
literature are different again. This diversity has implications for risk
communication.
- Evaluating the Wisdom of
Crowds in Assessing Phishing Websites by Tyler Moore and Richard Clayton
challenges the fashionable approach of turning decisions over to end
users on the Internet. Letting users vote on what websites are evil creates
many opportunities for abuse because of the huge variance in participation
rates.
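The variance argument can be made concrete with a simple expected-value sketch: when only a small fraction of honest users ever vote on a given site, a small colluding ring that always votes can outvote them. The participation rate and ring size below are invented for illustration.

```python
# Toy illustration of the variance-in-participation problem (assumed figures).

honest_users = 1000
participation_rate = 0.02            # fraction of honest users who vote on any one site
expected_honest_votes = honest_users * participation_rate   # = 20 votes

# A small ring of colluders who vote on every site they want to game.
abuser_ring = 30

print(f"expected honest votes per site: {expected_honest_votes:.0f}")
print(f"colluding votes per site:       {abuser_ring}")
print("verdict flippable by the ring:", abuser_ring > expected_honest_votes)
```

Even with a thousand honest users in the system, the 30-strong ring dominates any individual verdict, which is why Moore and Clayton found crowd-voted phishing classification so easy to game.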
- Risk in Networked
Information Systems by Robert Axelrod gives an overall summary of infosec
attack and defence with a strong flavour of game theory and institution
design in its perspectives, and an extended discussion, grounded in military
and intelligence history, of the resources available to mitigate these risks.
The event to aim for if you want to keep up with research in this field and get
to know people is WEIS
– the Workshop on the Economics of Information Security,
which happens every June. WEIS
2019 will be held in Boston on 3-4 June.
These links give you access to all the conference papers.
The Security and Human Behaviour workshop brings security engineers together
with psychologists, behavioral economists and others. The 2016 workshop will be
at Harvard.
Other relevant conferences include:
Community – Home Pages of People Interested in Security Economics