
Department of Computer Science and Technology

Information Theory

Course pages 2020–21

Lecture notes


Suggested schedule for lecture recordings and study:

9 October: Lecture 1 (slides 1–16)
Foundations

12 October: Lecture 2 (slides 17–30)
Entropies defined

14 October: Lecture 3 (slides 31–43)
Distances and Markov sources

16 October: Lecture 4 (slides 44–53)
Variable-length and Huffman codes

19 October: Lecture 5 (slides 54–68)
Discrete channel capacity and error-correcting codes

Exercises 1–4, by 20 October.

21 October: Lecture 6 (slides 69–83)
Information projections into vector spaces

23 October: Lecture 7 (slides 84–98)
Fourier representations of information

26 October: Lecture 8 (slides 99–114)
Spectral properties, noise, and capacity of continuous channels

28 October: Lecture 9 (slides 115–125)
Analytical tools for aperiodic signals and data

30 October: Lecture 10 (slides 126–134)
Continuous-time encodings and demodulation schemes

2 November: Lecture 11 (slides 135–148)
Quantised degrees-of-freedom in continuous signals

Exercises 5–9, by 3 November.

4 November: Lecture 12 (slides 149–163)
Information diagram, Uncertainty Principle, and Gabor wavelets

6 November: Lecture 13 (slides 164–177)
Discrete and Fast Fourier Transforms, with the butterfly algorithm

9 November: Lecture 14 (slides 178–190)
Analysis by wavelets, and image compression protocols

11 November: Lecture 15 (slides 191–203)
Kolmogorov complexity, astrophysics, and genomics

Exercises 10–14, by 12 November.

13 November: Lecture 16 (slides 204–220)
Applications of information theory in neuroscience and pattern recognition


20 November, 3:00pm: Q&A session (see Zoom invitation)

Following the Q&A session, PDF copies of the exercise solutions are available:

Solutions for Exercises 1–4
Solutions for Exercises 5–9
Solutions for Exercises 10–14