Information Theory and Coding

University of Cambridge Computer Laboratory

Lecturer: Dr John Daugman

Taken by: Part II

Prerequisite courses: Continuous Mathematics, Probability, Discrete Mathematics

No. of lectures: 12 (Tues, Thurs at noon)

First lecture: Tuesday 9 October, 12:00, Rayleigh Lecture Theatre


The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; how discrete channels and measures of information generalize to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity, and time series.
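As a small illustration of the first two topics above (measuring information as entropy, and using entropy to obtain channel capacity), the following Python sketch computes Shannon entropy from a probability distribution and the capacity of a binary symmetric channel. The function names here are illustrative, not part of the course material.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i log2(p_i), in bits.
    Terms with p_i = 0 contribute nothing (0 log 0 = 0 by convention)."""
    return -sum(p * log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use, where H(p) is the binary entropy."""
    return 1 - entropy([p, 1 - p])

# A fair coin conveys exactly one bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A noiseless binary channel (p = 0) achieves the full one bit per use;
# at p = 0.5 the output is independent of the input and capacity is zero.
print(bsc_capacity(0.0))     # 1.0
print(bsc_capacity(0.5))     # 0.0
```

Note how capacity falls to zero at p = 0.5: the binary entropy of the noise then equals the one bit the channel could otherwise carry.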



At the end of the course students should be able to

Recommended book

Cover, T.M. & Thomas, J.A. (1991). Elements of Information Theory. New York: Wiley.

  • Learning Guide and exercise problems (PDF version; DVI version)
  • Syllabus
  • Past exam questions
  • Lecture Notes (PDF format)