*No. of lectures:* 12

*Prerequisite courses:* Continuous Mathematics, Probability, Discrete Mathematics

**Overview and historical origins: foundations and uncertainty.** Why the movements and transformations of information, just like those of a fluid, are law-governed. How concepts of randomness, redundancy, compressibility, noise, bandwidth, and uncertainty are intricately connected to information. Origins of these ideas and the various forms that they take.

**Mathematical foundations; probability rules; Bayes' theorem.** The meanings of probability. Ensembles, random variables, marginal and conditional probabilities. How the formal concepts of information are grounded in the principles and rules of probability.

**Entropies defined, and why they are measures of information.** Marginal entropy, joint entropy, conditional entropy, and the Chain Rule for entropy. Mutual information between ensembles of random variables. Why entropy is the fundamental measure of information content.

**Source coding theorem; prefix, variable-, and fixed-length codes.** Symbol codes. The binary symmetric channel. Capacity of a noiseless discrete channel. Error-correcting codes.

**Channel types, properties, noise, and channel capacity.** Perfect communication through a noisy channel. Capacity of a discrete channel as the maximum of its mutual information over all possible input distributions.

**Continuous information; density; noisy channel coding theorem.** Extensions of the discrete entropies and measures to the continuous case. Signal-to-noise ratio; power spectral density. Gaussian channels. Relative significance of bandwidth and noise limitations. The Shannon rate limit and efficiency for noisy continuous channels.

**Fourier series, convergence, orthogonal representation.** Generalised signal expansions in vector spaces. Independence. Representation of continuous or discrete data by complex exponentials. The Fourier basis. Fourier series for periodic functions. Examples.

**Useful Fourier theorems; transform pairs. Sampling; aliasing.** The Fourier transform for non-periodic functions. Properties of the transform, and examples. Nyquist's Sampling Theorem derived, and the cause (and removal) of aliasing.

**Discrete Fourier transform. Fast Fourier Transform algorithms.** Efficient algorithms for computing Fourier transforms of discrete data. Computational complexity. Filters, correlation, modulation, demodulation, coherence.

**The quantised degrees-of-freedom in a continuous signal.** Why a continuous signal of finite bandwidth and duration has a fixed number of degrees-of-freedom. Diverse illustrations of the principle that information, even in such a signal, comes in quantised, countable packets.

**Gabor-Heisenberg-Weyl uncertainty relation. Optimal "logons".** Unification of the time domain and the frequency domain as endpoints of a continuous deformation. The Uncertainty Principle and its optimal solution by Gabor's expansion basis of "logons". Multi-resolution wavelet codes. Extension to images, for analysis and compression.

**Kolmogorov complexity. Minimal description. Time series.** Definition of the algorithmic complexity of a data sequence, and its relation to the entropy of the distribution from which the data was drawn. Shortest possible description length. Wiener analysis and the information in a time series.
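The entropy and mutual-information quantities above can be made concrete with a short sketch. This is an illustrative example, not course material: it computes I(X;Y) for a binary symmetric channel with an (assumed) crossover probability of 0.1 and a uniform input, which is the capacity-achieving distribution, so the result equals the channel capacity 1 − H(0.1) ≈ 0.531 bits.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Binary symmetric channel with crossover probability f = 0.1,
# driven by a uniform input (the capacity-achieving distribution).
f = 0.1
joint = [0.5 * (1 - f), 0.5 * f,    # P(x=0,y=0), P(x=0,y=1)
         0.5 * f, 0.5 * (1 - f)]    # P(x=1,y=0), P(x=1,y=1)

H_X = entropy([0.5, 0.5])                              # marginal entropy of input
H_Y = entropy([joint[0] + joint[2], joint[1] + joint[3]])  # marginal entropy of output
H_XY = entropy(joint)                                  # joint entropy
I_XY = H_X + H_Y - H_XY                                # mutual information I(X;Y)

print(round(I_XY, 3))  # equals 1 - H(f), the BSC capacity
```

The identity I(X;Y) = H(X) + H(Y) − H(X,Y) used here is exactly the Chain-Rule relationship among the entropies listed in the syllabus.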

**Objectives**

At the end of the course students should be able to:

- calculate the information content of a random variable from its probability distribution
- relate the joint, conditional, and marginal entropies of variables to their coupled probabilities
- define channel capacities and properties using Shannon's Theorems
- construct efficient codes for data on imperfect communication channels
- generalise the discrete concepts to continuous signals on continuous
channels
- understand Fourier Transforms and the main ideas of efficient
algorithms for them
- describe the information resolution and compression properties of wavelets
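The sampling and aliasing objective can be demonstrated numerically. A minimal sketch, assuming NumPy is available; the 90 Hz tone and 100 Hz sampling rate are illustrative choices, picked so the Nyquist criterion is violated:

```python
import numpy as np

# A 90 Hz sine sampled at fs = 100 Hz violates the Nyquist criterion
# (fs/2 = 50 Hz), so the tone aliases down to |90 - 100| = 10 Hz.
fs = 100
t = np.arange(fs) / fs            # one second of samples
x = np.sin(2 * np.pi * 90 * t)

spectrum = np.abs(np.fft.rfft(x))
peak_hz = int(np.argmax(spectrum))  # with 1 s of data, bin index = frequency in Hz

print(peak_hz)  # 10, not 90: the tone has aliased
```

Sampling at any rate above 180 Hz would instead place the spectral peak at the true 90 Hz, which is the content of Nyquist's Sampling Theorem covered in the lectures.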


- Lecture Notes (PostScript file)
- Learning Guide and exercise problems (PostScript file; DVI version also available)
- Syllabus

**Past exam questions**