Department of Computer Science and Technology

Part II CST

 

Course pages 2022–23

Deep Neural Networks

Principal lecturer: Dr Ferenc Huszar
Additional lecturer: Dr Nic Lane
Taken by: Part II CST
Code: DNN
Term: Lent
Hours: 14 (14hrs lectures)
Format: In-person lectures
Class limit: max. 30 students

Objectives

You will gain detailed knowledge of

  1. Current understanding of generalization in neural networks vs classical statistical models.
  2. Optimization procedures for neural network models such as stochastic gradient descent and ADAM.
  3. Automatic differentiation and at least one software framework (PyTorch, TensorFlow), as well as an overview of other software approaches.
  4. Architectures deployed to deal with different data types such as images or sequences, including (a) convolutional networks and (b) recurrent networks.
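As a flavour of topic 3, here is a minimal sketch of reverse-mode automatic differentiation, the mechanism underlying frameworks such as PyTorch and TensorFlow. The `Var` class and its methods are illustrative only, not part of any real framework, and the recursive backward pass is a naive version of what real systems do with a topological ordering.

```python
class Var:
    """A scalar that records the operations applied to it, so gradients
    can be propagated backwards (reverse-mode automatic differentiation).
    Naive sketch: real frameworks traverse the graph in topological order."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # (parent Var, local derivative) pairs

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, grad=1.0):
        # Accumulate the incoming gradient, then apply the chain rule
        # to each parent using the recorded local derivative.
        self.grad += grad
        for parent, local in self._parents:
            parent.backward(grad * local)

# d/dx of (x*x + x) at x = 3 is 2*3 + 1 = 7
x = Var(3.0)
y = x * x + x
y.backward()
print(x.grad)  # 7.0
```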

In addition, you will gain knowledge of more advanced topics reflecting recent research in machine learning, chosen from the following list.

  1. Approaches to unsupervised learning including autoencoders and generative adversarial networks.
  2. Techniques for deploying models in low data regimes such as transfer learning and meta-learning.
  3. Techniques for propagating uncertainty such as Bayesian neural networks.
  4. Deployment of neural network models in hardware systems.

Teaching Style

The start of the course will focus on the current theoretical understanding of neural networks, contrasting it with the classical statistical understanding of generalization performance. Then we will move to practical examples of network architectures and deployment. We will end with more advanced topics reflecting current research.

Schedule

Week 1

Two lectures: Generalization and Neural architectures.

Week 2

Two lectures: Optimization: Stochastic Gradient Descent and ADAM.
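The two update rules covered here can be sketched in plain Python; the hyperparameter values below are the commonly used defaults, chosen purely for illustration.

```python
def sgd_step(w, grad, lr=0.1):
    """Plain stochastic gradient descent: step against the gradient."""
    return w - lr * grad

def adam_step(w, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One ADAM update: exponential moving averages of the gradient (m)
    and squared gradient (v), with bias correction for the first steps."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
    w = w - lr * m_hat / (v_hat ** 0.5 + eps)
    return w, m, v

# Minimise f(w) = w^2 (gradient 2w) from w = 5.0 with both methods
w_sgd = w_adam = 5.0
m = v = 0.0
for t in range(1, 101):
    w_sgd = sgd_step(w_sgd, 2 * w_sgd)
    w_adam, m, v = adam_step(w_adam, 2 * w_adam, m, v, t)
print(w_sgd, w_adam)  # both approach the minimum at 0
```

Note that ADAM's effective step size is roughly `lr` regardless of the gradient's magnitude, because the gradient is normalised by its own running scale; this is one reason it is less sensitive to learning-rate tuning than plain SGD.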

Week 3

Two lectures: Background: Automatic differentiation and GPU acceleration.

Week 4

Two lectures: Neural architectures: Convolutional neural networks.
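The core operation of these lectures can be sketched in a few lines of plain Python: a "valid" 2-D convolution (strictly, cross-correlation, as implemented by most deep learning libraries). The edge-detector kernel below is a standard illustrative example, not taken from the course material.

```python
def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation): slide the kernel over
    the image, summing elementwise products at each position."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

# A vertical-edge detector applied to an image with a left/right step
image = [[0, 0, 1, 1]] * 4
kernel = [[1, -1], [1, -1]]  # responds where intensity changes horizontally
print(conv2d(image, kernel))  # nonzero only at the edge column
```

A convolutional layer learns the kernel entries by gradient descent rather than fixing them by hand, and applies many such kernels in parallel, but the sliding-window computation is exactly this one.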

Week 5

Two lectures: Neural architectures: Recurrent Neural Networks and LSTMs.
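The LSTM cell at the heart of these lectures can be sketched for scalar inputs and states; the weight values below are arbitrary and purely illustrative (real implementations use learned weight matrices over vectors).

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W):
    """One LSTM step for scalar input/state. W maps each gate to
    (input weight, recurrent weight, bias). The cell state c carries
    long-range information; the gates decide what to forget, what to
    write, and what to expose as the new hidden state h."""
    f = sigmoid(W['f'][0] * x + W['f'][1] * h + W['f'][2])    # forget gate
    i = sigmoid(W['i'][0] * x + W['i'][1] * h + W['i'][2])    # input gate
    o = sigmoid(W['o'][0] * x + W['o'][1] * h + W['o'][2])    # output gate
    g = math.tanh(W['g'][0] * x + W['g'][1] * h + W['g'][2])  # candidate
    c = f * c + i * g          # update the cell state
    h = o * math.tanh(c)       # new hidden state
    return h, c

# Run the cell over a short sequence with arbitrary illustrative weights
W = {k: (0.5, 0.5, 0.0) for k in 'fiog'}
h = c = 0.0
for x in [1.0, -1.0, 1.0]:
    h, c = lstm_step(x, h, c, W)
print(h, c)
```

The additive cell-state update `c = f*c + i*g` is what distinguishes LSTMs from plain recurrent networks: gradients flow through it without repeated multiplication by a weight matrix, mitigating the vanishing-gradient problem.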

Weeks 6-8

Lectures from the following list of special topics.

Special topics

  1. Neural architectures: Auto Encoders and Generative Adversarial Networks
  2. Hardware Implementations
  3. Reinforcement learning
  4. Transfer learning and meta-learning
  5. Uncertainty and Bayesian Neural Networks

Assessment

Assignment 1

30% of the total marks.

Assignment 2

70% of the total marks.