Information Theory and Coding

University of Cambridge Computer Laboratory

Lecturer: Dr J.G. Daugman (jgd1000@cl.cam.ac.uk)

No. of lectures: 12

Prerequisite courses: Continuous Mathematics, Probability, Discrete Mathematics


Aims

The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; how discrete channels and measures of information generalize to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity, and time series.
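As a taste of the first of these topics (not part of the official syllabus), the following minimal Python sketch shows how entropy and mutual information might be computed for a small discrete channel; the joint distribution is an invented example of a binary symmetric channel with crossover probability 0.1 and uniform input.

    # Illustrative sketch only: Shannon entropy, joint entropy and mutual
    # information for a hypothetical binary symmetric channel (p = 0.1),
    # using only the Python standard library.
    import math

    def entropy(probs):
        """H = -sum p log2 p, ignoring zero-probability outcomes."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Assumed joint distribution p(x, y): uniform input, crossover 0.1.
    joint = {
        (0, 0): 0.45, (0, 1): 0.05,
        (1, 0): 0.05, (1, 1): 0.45,
    }

    # Marginal distributions p(x) and p(y).
    p_x = [sum(p for (x, _), p in joint.items() if x == xi) for xi in (0, 1)]
    p_y = [sum(p for (_, y), p in joint.items() if y == yi) for yi in (0, 1)]

    H_X = entropy(p_x)
    H_Y = entropy(p_y)
    H_XY = entropy(joint.values())
    I_XY = H_X + H_Y - H_XY   # mutual information I(X;Y)

    print(f"H(X)   = {H_X:.4f} bits")
    print(f"H(X,Y) = {H_XY:.4f} bits")
    print(f"I(X;Y) = {I_XY:.4f} bits")

For this particular channel the uniform input happens to achieve capacity, so the printed I(X;Y) of about 0.531 bits also equals the channel capacity 1 - H(0.1).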

Lectures

Objectives

At the end of the course students should be able to:

Recommended book

Cover, T.M. & Thomas, J.A. (1991). Elements of Information Theory. New York: Wiley.


Taken by: Part II
Lecture location: Heycock Room
Lecture times: 10:00 on MWF starting 09-Oct-98

