Computer Laboratory > Teaching > Course material 2009–10 > Computer Science Tripos Syllabus and Booklist 2009-2010 > Information Theory and Coding

Next: Natural Language Processing Up: Michaelmas Term 2009: Part Previous: Information Retrieval   Contents


Information Theory and Coding

Lecturer: Professor J.G. Daugman

No. of lectures + examples classes: 11 + 1

Prerequisite courses: Probability, Discrete Mathematics, Mathematical Methods for Computer Science

Aims

The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; how discrete channels and measures of information generalise to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity and compression.
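As a small illustration of two of the quantities mentioned above, the following sketch (not part of the syllabus) computes the Shannon entropy of a discrete distribution and, from it, the capacity of a binary symmetric channel with crossover probability p, using the standard formula C = 1 - H(p):

```python
# Illustrative sketch only: Shannon entropy and binary symmetric
# channel (BSC) capacity, two quantities covered in the course.
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p_i log2 p_i, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a BSC with crossover probability p: C = 1 - H(p)."""
    return 1.0 - entropy([p, 1.0 - p])

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(bsc_capacity(0.11))    # about 0.5 bits per channel use
```

A noiseless channel (p = 0) gives the full capacity of 1 bit per use, and capacity falls to zero at p = 0.5, where the output is independent of the input.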

Lectures

Objectives

At the end of the course students should be able to

Recommended reading

* Cover, T.M. & Thomas, J.A. (1991). Elements of information theory. New York: Wiley.
