Information Theory and Coding

University of Cambridge Computer Laboratory

Lecturer: Dr John Daugman

Taken by: Part II
No. of lectures: 12

Prerequisite courses: Continuous Mathematics, Probability, Discrete Mathematics


Aims

The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; how discrete channels and measures of information generalize to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity, and time series.
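
As a small, informal taste of the first topic above, the following Python sketch (not taken from the course materials; the function names and the example distribution are purely illustrative) computes entropy, mutual information, and the capacity of a binary symmetric channel, all in bits:

    # Illustrative sketch only: entropy, mutual information, and the
    # capacity of a binary symmetric channel, measured in bits.
    from math import log2

    def entropy(p):
        """Shannon entropy H(X) = -sum p(x) log2 p(x)."""
        return -sum(px * log2(px) for px in p if px > 0)

    def joint_entropy(pxy):
        """H(X,Y) for a joint distribution given as a 2-D list."""
        return -sum(p * log2(p) for row in pxy for p in row if p > 0)

    def mutual_information(pxy):
        """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
        px = [sum(row) for row in pxy]
        py = [sum(col) for col in zip(*pxy)]
        return entropy(px) + entropy(py) - joint_entropy(pxy)

    def bsc_capacity(eps):
        """Capacity of a binary symmetric channel with crossover
        probability eps: C = 1 - H(eps)."""
        return 1.0 - entropy([eps, 1.0 - eps])

    if __name__ == "__main__":
        print(entropy([0.5, 0.5]))        # a fair coin: 1.0 bit
        pxy = [[0.4, 0.1],                # two correlated binary variables
               [0.1, 0.4]]
        print(mutual_information(pxy))    # about 0.278 bits
        print(bsc_capacity(0.1))          # noise reduces capacity below 1 bit
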

Lectures

Objectives

At the end of the course students should be able to:

Recommended book

Cover, T.M. & Thomas, J.A. (1991). Elements of Information Theory. New York: Wiley.


  • Learning Guide and exercise problems (PostScript file; also available in DVI)
  • Syllabus
  • Past exam questions
  • Examples Classes
  • Lecture Notes (PostScript file)