Software Tools & Die

Abstract

I am not a social scientist, nor do I play one on YouTube, but I work with a few social scientists, so here I'll try to channel some of their approach.

Modern software production is often a highly collaborative and even distributed business. A number of underpinning technologies have emerged, such as internet repositories with version control (GitHub, SVN, Bitbucket) offering secure, reliable access management, documentation and release mechanisms, as well as group communication applications. Some of these tools emerged from a somewhat alternative social background (BitTorrent, Jabber), but are now in mainstream use by software development groups within large companies working on closed/proprietary software, as well as in open source development communities. Indeed, you could say that the cyberworld in which many citizens live today is just our work-a-day toolbox.

These systems (mostly) provide very good logging capability, so it is possible to identify who has "committed" which code, how much code is committed by each contributor, and also why (e.g. added a feature, fixed a bug, or just general progress); a small sketch of how such numbers can be pulled out of a repository's history follows these observations. In the software engineering research community there have been published studies of these processes (both closed and open source), and one interesting outcome, reproduced in several studies, is that simple metrics (lines of code committed per unit time) bear no relation to value (quality of code, usefulness of features!). Another interesting feature is that in some communities there are pseudonymous contributors, whereas in others individuals wish to take credit. Nor is this correlated with whether there is an authority (Linux/Torvalds versus BSD) or an owner (Microsoft versus Xen).

In this brief talk, I'll present this world and briefly comment on its diversity.

Some observations:

This is important because, increasingly, everyday stuff is made of, or by, software. It starts with computer hardware (processors, radios) being largely the product of cunning programmes that lay out logic more complex than any human could manage in a lifetime, with potentially mathematically proven correct behaviour; then radios themselves (software radios are present in many smart phones and tablets today); then software-defined networks and services. More interestingly, software can define what you print (from new door knobs to entire new buildings), what chemical or drug is created, and the appearance and physical behaviour of objects around you (from simple car seats adapting to whoever sits in them to drive, to rooms changing lighting or arrangement).

It is also important because software is general. Alan Turing's great model of a general, universal Turing Machine is "complete", in the sense that any and all such machines are equivalent and can execute any and all programmes (subject only to resource limitations). Hence a single device like a smart phone can run millions of apps. However, many organisations try to prevent this generality being available to the user, "stunning" features of devices to prevent applications running rings around business models. This is sometimes referred to (in commercial terms) as a tussle space, where market forces play out and "free market" ideas meet special pleading. Cory Doctorow has written about this under the heading "The War on General Purpose Computing".
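As a concrete illustration of the kind of accounting these repositories make trivial, here is a minimal sketch, assuming Python 3, a git checkout on disk and git on the PATH; the function name and output format are mine, purely illustrative, and not taken from any particular study:

```python
# Sketch: count commits and lines added/removed per author in a local git
# repository, using "git log --numstat". Run it inside (or point it at)
# any checked-out repository.
import subprocess
from collections import defaultdict

def per_author_stats(repo="."):
    out = subprocess.run(
        ["git", "-C", repo, "log", "--numstat", "--pretty=format:@%an"],
        capture_output=True, text=True, check=True,
    ).stdout
    commits = defaultdict(int)
    added = defaultdict(int)
    removed = defaultdict(int)
    author = None
    for line in out.splitlines():
        if line.startswith("@"):          # one "@author" line per commit
            author = line[1:]
            commits[author] += 1
        elif line.strip() and author is not None:
            a, r, _path = line.split("\t", 2)
            if a.isdigit():               # binary files show "-" here
                added[author] += int(a)
            if r.isdigit():
                removed[author] += int(r)
    return commits, added, removed

if __name__ == "__main__":
    commits, added, removed = per_author_stats()
    for who in sorted(commits, key=commits.get, reverse=True):
        print(f"{who}: {commits[who]} commits, +{added[who]}/-{removed[who]} lines")
```

Of course, as the studies mentioned above found, counts like these are easy to produce and say nothing at all about the value of what was committed.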
See, for example, http://postscapes.com/internet-of-things-award/2014/winners

Software engineers spend most of their time bricklaying - putting line upon line and mortaring things together, much as a medieval artisan would have worked on Chartres Cathedral, or plastered the wall and mixed the paints for the fresco painter. A small (vanishingly tiny) part of the work is highly innovative and creative, and there are three places this creativity shows up: (1) thinking of the app in the first place, (2) thinking of a new algorithm or data structure, and (3) debugging. None of these is mechanisable; each relies on intuition, art, insight and luck. Each often goes unnoticed in the big scheme of things, which looks much more like a lot of book-keeping than a great work, until you step back from the whole edifice (millions of lines of code making up a new air traffic control system, or a new social media cloud service) and see what it does. When it all goes horribly wrong (bugs), finding the problem is like finding the clichéd needle in a haystack.

Software is sometimes referred to as being "brittle": every line of code that is actually used needs to be correct. This is not like the analog world, where small mistakes may not matter - think of a few errors in a knitting pattern or in a performance of a piece of music; in many cases they go unnoticed, and only occasionally does the whole Christmas jumper unravel, or the whole orchestra come to a complete stop, unable to work out how to get there from here. In software this happens all too often if a single simple logical mistake is made somewhere in those 4 million lines (a contrived sketch of such a one-character slip appears at the end of this note). Hence the analogy with the medieval cathedral builder or the fresco artist (let's say Francesco del Cross, for example): you get paid to produce the stuff. The amazing artistic result is often not really appreciated until much later, when the software company IPOs or is acquired. Typically a good team might fetch $500,000 per person, which would value a 50-member company at around $25M; so a 50-member company like Xensource (a Computer Lab startup in Cambridge a few years back) did very well to get $500M!

So the creative step (so beloved of patent supporters) is (a) hard to spot, (b) often just a piece of maths, and (c) frequently discovered in an unremembered moment rather than sweated over for years of trials in a lab - although that can happen too.

Type one (choosing the "killer app") is also very hit-and-miss: even after thinking through lots of neat ideas, only 1 in 10 of the things backed by seasoned investors (venture capitalists - VCs) actually succeed. The other 9 in 10 (probably more like 999 in 1,000, if you allow for self-selection by people even dreaming of going to a VC) vanish without much trace.
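To make the "brittleness" point concrete, here is a deliberately contrived sketch, not drawn from any real system: two copies of a textbook binary search that differ by a single character, one of which quietly fails for one particular query while working on most others.

```python
# Sketch: how a single-character slip makes software "brittle".
# Both functions claim to return the index of x in a sorted list (or -1);
# they differ only in the loop condition.

def bsearch_ok(xs, x):
    lo, hi = 0, len(xs) - 1
    while lo <= hi:              # correct: continue while the range is non-empty
        mid = (lo + hi) // 2
        if xs[mid] == x:
            return mid
        elif xs[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def bsearch_bad(xs, x):
    lo, hi = 0, len(xs) - 1
    while lo < hi:               # one character short: "<" instead of "<="
        mid = (lo + hi) // 2
        if xs[mid] == x:
            return mid
        elif xs[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = [2, 3, 5, 7, 11, 13]
print(bsearch_ok(data, 13))   # 5  - found
print(bsearch_bad(data, 13))  # -1 - the last element is silently "missing"
```

The buggy version answers most queries correctly, which is exactly what makes this kind of fault so hard to spot in a few million lines of code.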