Computer Laboratory

Course pages 2011–12

Computer Fundamentals

Computer Fundamentals is a four-lecture course, new for 2011/12, that is intended to provide elementary knowledge of computers and computer science.

Some students will already be familiar with much of the course content, but past experience has shown that the IA audience (Computer, Natural and Social Scientists) comes from a wide range of backgrounds. It is therefore useful to ensure that all students have a grasp of the fundamentals before delving deeper.

The course is not examined explicitly: no exam question on either of the IA CS papers is attributed to it. However, many of the concepts it touches on reappear in other courses, where they are open to examination.


The notes as provided in lecture one are available by clicking here.

The full set of annotated lecture notes is available here.

The examples sheet is available by clicking here.

Lecture 1: History

The first lecture was a meander through some of the history of the computer. It started late and I went slightly slower than planned, so we covered up to the creation of EDSAC.

Some students were interested in learning more, so here's a list of the things mentioned so that they can be looked up:

  • Differential Analysers
  • Hilbert's decision problem
  • (Universal) Turing machines (Alan Turing)
  • Polish Bombes
  • Heath Robinsons and Colossus at Bletchley Park
  • ENIAC and the draft report for EDVAC
  • Manchester Baby
  • EDSAC (Maurice Wilkes)

There are many resources on all of these topics, although some of the online sources are of dubious quality and sometimes contradictory. Memoirs of a Computer Pioneer by Maurice Wilkes and Alan Turing: The Enigma by Andrew Hodges are both very good reads. The latter is particularly good for providing the context for the war work.

If you are interested in Turing machines, there are many descriptions online. I rather like this one, especially the concrete examples section, which provides some simple action tables. Note that the machine there is not a true Universal Turing machine, since its action table is set beforehand and not read in from the tape.

Aside from general interest, the point of all this history is to set the scene and make you realise that computer science is firmly rooted in mathematics, although significant engineering effort was/is needed to realise computers. Today's computers appear to be unwieldy and complex, but in fact they are little different from the core ideas 70 years ago, and mapping them back to "maths machines" will help in understanding what is to come in later courses.

Lecture 2: Modern Systems/Fetch-Execute/Numbering Systems

The most important thing to take from this lecture is the fetch-execute cycle, closely followed by binary representations. The lecture covered up to the start of negative numbers.

Lecture 3: Negative Numbers/Compilers/Interpreters/Pointers

The two's complement technique for negative numbers is very important because it is the way most systems work today. You should be able to work in two's complement and show that addition is the same process for signed and unsigned numbers (other than the rules for spotting overflows and errors).

We also looked at compilers and interpreters and started to move towards higher level (more human readable/writable) languages. You should have a good appreciation of what a compiler is and why it is needed, as well as the merits and demerits of an interpreter.

We finished by introducing the notion of pointers (variables that hold memory addresses).

Lecture 4: Pointers/Programming Paradigms/Operating Systems

This lecture started by taking a fresh look at pointers (variables holding memory addresses). I used some basic (and somewhat dodgy) C code to demonstrate that we could jump around in memory, and the potential problems this could cause. You should be familiar with the idea of a pointer and aware that their use is risky (high efficiency gains, but also a greater chance of making a mistake).

We then moved on to consider imperative and functional programming. We saw that imperative languages represent programs as lots of state (variables in memory) together with operations that change that state in various ways. Functional programs, however, do not allow us to change state, only create more of it. This forces us to view programming differently, in a way that directly maps to the maths we know and love. This has many advantages (and some disadvantages). ML (a functional language) is your next stop in the CS curriculum.

We finished by discussing the need for operating systems and the ways in which computers fake running multiple programs concurrently. We postponed any discussion of Java until next term.