Computer Hardware: How Computers Work

Dr David Greaves

Corpus Christi College


Computer Laboratory

University of Cambridge

Charles Babbage - The Inventor of the Computer

Born December 26, 1791 in Teignmouth, Devonshire, UK. Died in 1871, London. Known as the 'Father of Computing' for his contributions to the basic design of the computer through his Analytical Engine.

Babbage Difference Engine

A mechanical calculator with FIXED (hardwired) program.


Computers and calculators need storage to keep data.
Computers store their program as well.

Today, storage is in REGISTERS or MAIN MEMORY/STORE, or SECONDARY MEMORY/STORE. The Difference Engine only had registers.

Digital storage stores digits. Babbage used base 10, not binary.

Mechanical register with noisy settings of its wheels       Nicely aligned digit settings

Digital storage mechanisms need to be tolerant to noise: digits must recover from small jogs and move back to one of their nominal settings.

Carry wheels increment the next digit on overflow of the current digit
The little white wheels provide what Babbage called the 'carriage' - the means to increment the next-most-significant digit when a given digit overflows.
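The carriage mechanism can be sketched in a few lines of Python (a sketch only, not Babbage's actual mechanism): a decimal register is a list of digit "wheels", least significant first, and a wheel that overflows past 9 snaps back to 0 and nudges the next wheel on.

```python
def increment(wheels):
    """Add one to a little-endian list of decimal digits, rippling carries."""
    carry = 1
    for i in range(len(wheels)):
        wheels[i] += carry
        if wheels[i] == 10:      # overflow: this wheel snaps back to 0...
            wheels[i] = 0
            carry = 1            # ...and the 'carriage' nudges the next wheel
        else:
            carry = 0
            break
    return wheels

print(increment([9, 9, 2]))  # 299 + 1 -> [0, 0, 3], i.e. 300
```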

In early mechanical designs, there was generally one arithmetic unit (ALU) per register. Key insight from Babbage: use just one main ALU served by a large, yet simple, data store.


A program is a list of instructions:

Four basic types: change a register, load/save register in memory, PC change, I/O.
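The four instruction types can be illustrated with a toy interpreter for a hypothetical one-register machine (the instruction names SET, LOAD, STORE, JUMP and PRINT are invented for this sketch):

```python
def run(program, memory):
    acc = 0      # the single register
    pc = 0       # program counter: which instruction is next
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "SET":        # change a register
            acc = arg
        elif op == "LOAD":     # load register from memory
            acc = memory[arg]
        elif op == "STORE":    # save register to memory
            memory[arg] = acc
        elif op == "JUMP":     # PC change
            pc = arg
        elif op == "PRINT":    # I/O
            print(acc)
    return acc, memory

# Copy memory location 0 into location 1, then print the register:
run([("LOAD", 0), ("STORE", 1), ("PRINT", 0)], {0: 42, 1: 0})
```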

Jacquard Loom

The loom is controlled with punched cards that describe the pattern to be woven.

  • YouTube Clip.       Solva Mill Clip.

    The card under the reader is the current point in the program (program counter or PC).

    There is no GOTO statement: the instructions are followed in sequence, but perhaps some could be predicated?

    This card reads '300 FACTORIAL PROGRAM BY GREAVES ------ '.

    I created it using a sort of typewriter like this IBM 029 Card Punch.

    Babbage's Second Design: The Analytical Engine

    Babbage realised that having a dedicated adder for every register was overly complex in general (parallel vector processing is the one speciality that still benefits from it).

    He split the machine into:

    1. the Mill with three registers coupled with adders and other logic (sufficient to do long multiplication and long division)
    2. and the Store of arbitrary size (going off to the right) with a linear array of simple registers for main data storage.
    3. Punched cards served the program and also could be read and written as secondary storage like today's disk drives and USB memory sticks.

  • Same basic design as all later electronic computers.

  • Unfortunately, owing to its expense, the mechanical version was never built, either in Babbage's lifetime or since.

    Electronic Era

    1947: EDVAC

    The EDVAC was an early electronic computer that had separate memories for program and data. Main memory held 1024 ten-digit decimal numbers.

    Stored Program Computation

    John Von Neumann - The Second Inventor of the Computer

    First Draft of a Report on the EDVAC (commonly shortened to First Draft) was an incomplete 101-page document written by John von Neumann and distributed on June 30, 1945. Program and data were to be put in the same type of memory.

    EDSAC: A stored-program computer.

    Prof Sir Maurice Wilkes,
    University of Cambridge, Computer Laboratory:


    Electronic Delay Storage Automatic Calculator (EDSAC) was an early British computer. The machine, having been inspired by John von Neumann's seminal First Draft of a Report on the EDVAC, was constructed by Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England. EDSAC was the first practical stored-program electronic computer.

    A smaller machine, the Manchester 'Baby', was built in Manchester by Freddie Williams and Tom Kilburn; Alan Turing joined their laboratory shortly afterwards. Both groups had secretly worked on code-breaking machines during World War II.

    Alan Turing:

    EDSAC was supported by J. Lyons & Co. Ltd., a British firm, who were rewarded with the first commercially applied computer, LEO I, based on the EDSAC design. EDSAC ran its first programs on May 6, 1949, calculating a table of squares and a list of prime numbers.

    EDSAC could address 1024 main memory locations, though only 512 were initially implemented. Each contained 18 binary digits.

    Since then, really the only developments have been in speed, size, power consumption and storage capacity: the basic design has not changed!

    Electronic Computing Basics: The Telegraph

    Morse Code

    Morse code tapper used for telegraphs:

    Serial and Parallel Communication

    With eight wires, using binary on each line, you can send one BYTE at a time.
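The contrast between serial and parallel transmission can be sketched as follows (a sketch only: eight wires carry one byte at once, while a single wire carries one bit per clock tick, least significant bit first):

```python
def to_parallel(byte):
    """Eight simultaneous wire levels for one byte."""
    return [(byte >> i) & 1 for i in range(8)]

def send_serial(byte):
    """Yield the byte one bit at a time over a single wire."""
    for i in range(8):
        yield (byte >> i) & 1

def receive_serial(bits):
    """Reassemble eight serial bits back into a byte."""
    byte = 0
    for i, bit in enumerate(bits):
        byte |= bit << i
    return byte

assert receive_serial(send_serial(ord("A"))) == ord("A")
```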

    A Relay Station

    Wire degrades the signal. Need a human or electro-mechanical relay every 50 or so miles.

    An electro-mechanical relay is a switch operated by an electro-magnet.

    It keeps our binary digits clean.


    Buffering (relaying) the telegraph signal:


    Viewing it as a logic circuit.

    Rather than many miles, the same ideas are used for sending information a few centimetres over a circuit board or a fraction of a millimetre over a silicon chip. We then need only one battery or power supply.

    Mechanical Teletype: Advantages: don't need to learn Morse Code, don't need to manually write it down at the receiving station, faster.

    Modern character set:

    Use an 8-bit byte for each character sent.

    Compatibility: You can still plug a 1950 mechanical teletype into a recent laptop using the RS-232 serial port.

    Only one wire in each direction is actually needed, so with an earth (ground) connection as well, a total of three pins are commonly used on the nine-pin serial connector.

    Gate Transfer Curves: Keep those digits clean!

    Plots of input and output voltage for the basic components.

    Characteristic shape: flat at the edges, sharply changing in the middle: keeps binary digits clean.
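The restoring effect of such a transfer curve can be sketched numerically (the threshold voltages below are illustrative, not taken from any real datasheet): input voltages near either rail are snapped back to a clean 0 V or 5 V, and only a narrow middle region is ambiguous.

```python
def restore(v_in, v_low=0.8, v_high=2.0, vdd=5.0):
    """Idealised buffer transfer curve: flat at the edges, steep in the middle."""
    if v_in <= v_low:
        return 0.0      # anything near ground reads as a clean 0
    if v_in >= v_high:
        return vdd      # anything near the supply rail reads as a clean 1
    # the narrow, steeply-changing middle region:
    return vdd * (v_in - v_low) / (v_high - v_low)

print(restore(0.3))   # a noisy low input is restored to 0.0
print(restore(4.6))   # a noisy high input is restored to 5.0
```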

    Voltage meanings and margins (five volt system):

    How To Store One Bit Electrically

    Electro-mechanical implementation: Using a relay

    A latching-relay can hold one binary digit (one bit): it is a bi-stable or flip-flop.

    Alternative Implementation: Electronic

    Flip-flops made from two inverters in a ring are the main means of making registers in silicon chips today.

    The mechanical and electronic bistables are both volatile, meaning that information is lost when the power is removed.
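The two-inverter ring can be sketched in Python (a sketch only): each inverter's output feeds the other's input, so the pair settles into one of two stable states and holds the stored bit for as long as it is evaluated.

```python
def inverter(x):
    """One logic inverter: 0 becomes 1, 1 becomes 0."""
    return 0 if x else 1

def settle(q):
    """Drive the ring of two inverters for a few rounds; the bit is held."""
    for _ in range(4):
        q = inverter(inverter(q))   # around the ring and back again
    return q

assert settle(1) == 1   # a stored 1 stays a 1
assert settle(0) == 0   # a stored 0 stays a 0
```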

    Speed of switching: relay is at least a million times slower than a transistor.

    Size of the switch: relay uses about 100 million times the substrate area.

    Besides bistables, most computers today store data as magnetic spots on the hard disk and as electric charge in Flash and DRAM memories. These give a few orders of magnitude greater density than transistor bistables.

    A Logic Gate

    The simplest function we can compute is implemented by a basic logic gate.

    My logic gate is implemented with two relays: while they are both activated by logic ones on their inputs, the output becomes a logic one. On chips, gates are made out of field-effect transistors, and four transistors are used instead of two relays.

    Other Basic Gates

    There is also an XOR gate, and versions of AND and OR with inverted outputs called NAND and NOR.

    Adder Circuits

    A half adder tallies two inputs creating a binary number in the range 0 to 2.
        A  B  |   C  S
        0  0  |   0  0
        0  1  |   0  1
        1  0  |   0  1
        1  1  |   1  0 
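The truth table above can be checked in two lines of Python (a sketch using bitwise operators): the sum is the XOR of the inputs and the carry is their AND.

```python
def half_adder(a, b):
    """Add two bits, returning (carry, sum) as in the table above."""
    return a & b, a ^ b

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```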

    The full adder: made of two half adders:

    A full-adder tallies three inputs creating a binary number in the range 0 to 3.

    A full-adder tattoo:

    Larger adders can be made by cascading full adders:

    A three-bit adder is made from three full adders (six half adders).
       A3 A2 A1    B3 B2 B1  |   C4 S3 S2 S1
       1  0  1     0  1  0   |   0  1  1  1     5 + 2 = 7
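The cascade can be sketched in Python: a full adder built from two half adders, and three full adders chained so that each carry out feeds the next stage's carry in, reproducing the 5 + 2 = 7 example above.

```python
def half_adder(a, b):
    return a & b, a ^ b                 # (carry, sum)

def full_adder(a, b, cin):
    """Two half adders plus an OR gate combine three input bits."""
    c1, s1 = half_adder(a, b)
    c2, s = half_adder(s1, cin)
    return c1 | c2, s                   # (carry out, sum)

def ripple_add(a_bits, b_bits):
    """Add two little-endian bit lists of equal length, rippling the carry."""
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        carry, s = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 5 = 101 and 2 = 010, written least significant bit first:
print(ripple_add([1, 0, 1], [0, 1, 0]))  # [1, 1, 1, 0], i.e. 7 with no carry out
```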

    Three-bit adder schematic symbol (hides the details):

    Gate count: each gate is roughly 4 transistors. A three-bit adder is 15 gates = 60 transistors.

    Moore's Law

    Moore's law describes an important trend in the history of computer hardware: that the number of transistors that can be inexpensively placed on an integrated circuit is increasing exponentially, doubling approximately every two years. The observation was first made by Intel co-founder Gordon E. Moore in a 1965 paper.

    The largest chips are about the size of a fingernail.

    There are five or six major chips in a Nintendo DS, and dozens of small ones.

    Multiplexor: Another Basic Component

    Making a transparent latch flip-flop

    Putting flip-flops together to make a register.

    (These flip-flops are edge-triggered, meaning that they sample their input data on the positive edge of the clock. Internally, they use a pair of transparent latches).

    Primary Storage/Main Memory: RAM - Random Access Memory

    Possible implementation of RAM using a linear array of registers:
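A sketch of that idea in Python: the RAM is a linear array of registers, and the address simply selects which one to read or write.

```python
class RAM:
    """Random access memory as a linear array of registers."""

    def __init__(self, n_words):
        self.cells = [0] * n_words      # one register per location

    def read(self, address):
        return self.cells[address]

    def write(self, address, value):
        self.cells[address] = value

mem = RAM(1024)
mem.write(3, 99)
print(mem.read(3))   # 99
```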

    Dynamic RAM

    Larger arrays are made using dynamic technology, using electric charge on a capacitor to store a bit instead of the ring-pair of inverters.

    Other important components: Counter and Clock

    The clock generates a regular series of transitions.

    A counter increments on each clock pulse.

    How to make a counter from an adder and a broadside register:

    With n output wires the count range is 0 to 2^n - 1: ten wires give 0 to 1023, twelve give 0 to 4095.
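The counter can be sketched as a broadside register whose next value, computed by the adder, is latched on each clock pulse, wrapping around when all output wires are at one:

```python
class Counter:
    """A broadside register fed by an adder: increments on each clock pulse."""

    def __init__(self, n_bits):
        self.n_bits = n_bits
        self.value = 0                          # the broadside register

    def clock(self):
        """One clock pulse: the adder output (value + 1) is latched."""
        self.value = (self.value + 1) % (1 << self.n_bits)
        return self.value

c = Counter(10)                 # ten output wires: counts 0 to 1023
for _ in range(1025):
    c.clock()
print(c.value)                  # wrapped around: 1025 mod 1024 = 1
```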

    Schematic symbol hides the details.

    Now We've Seen Enough to Build a Complete Computer

    Come back and look at these slides in your own time ...

    Nand to Tetris: The Elements of Computing Systems / Nisan & Schocken

    Counter Connected to a RAM memory

    The counter output can index a RAM containing instructions.

    Put instructions in the RAM and it steps through them in sequence, like stepping through the Jacquard punch cards.

    A jumpable counter

    Here we add a multiplexor between the adder and the broadside register.

    This enables a new value to be seeded.

    One of the instructions can be a GOTO, that operates the multiplexor to select a branch destination.
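A sketch of the jumpable counter: the multiplexor chooses between "current value + 1" and a branch destination, and whichever it selects is latched into the register on the next clock.

```python
class JumpableCounter:
    """A counter with a multiplexor between the adder and the register."""

    def __init__(self, n_bits):
        self.mask = (1 << n_bits) - 1
        self.value = 0

    def clock(self, jump=False, target=0):
        """Mux select: plain increment, or seed a new value (a GOTO)."""
        nxt = target if jump else self.value + 1
        self.value = nxt & self.mask
        return self.value

pc = JumpableCounter(10)
pc.clock()                      # 1
pc.clock()                      # 2
pc.clock(jump=True, target=7)   # a GOTO selects the branch destination
print(pc.value)                 # 7
```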

    An Execution Unit with Control Unit

    The ALU is an arithmetic and logic unit: it can compute a variety of functions of its two inputs.
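That behaviour can be sketched as one block computing a selectable function of its two inputs (the operation codes below are invented for illustration):

```python
def alu(op, a, b):
    """Compute one of several functions of the two inputs, chosen by op."""
    if op == "ADD":
        return a + b
    if op == "SUB":
        return a - b
    if op == "AND":
        return a & b
    if op == "OR":
        return a | b
    if op == "XOR":
        return a ^ b
    raise ValueError(f"unknown operation: {op}")

assert alu("ADD", 5, 2) == 7
assert alu("AND", 0b1100, 0b1010) == 0b1000
```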

    Combining the Execution and Control Units

    The combination of control and execution units on a single chip is called a microprocessor.

    When there are multiple such processors on one chip, each one is called a core.

    A single bus connects the processor to the memory system. This has been called the von Neumann bottleneck. It is avoided in today's computers by dedicating a large fraction of the processor chip to cache memories.

    (You can do all this using relays if you want: A Relay Computer).

    Our Program Again

    The von Neumann computer:

    Computer Architecture (further reading):

    Understanding typical computer bus structures:

    The Future

    Conventional computers cannot be made any smaller or faster out of silicon: we are running out of atoms!

    Today's new computers have multiple conventional computer cores on one chip: for example the "Intel Core 2 Duo" or the AMD "Turion". The cores all share the same main memory.

    We are seeing the number of cores on the processor chip growing...

    But the size of an atom and the speed of light are fixed, so a new approach is needed for further performance improvements ... ?

    Thank you for listening.


    END. (C) 2008-14 David Greaves.