
Department of Computer Science and Technology



Course pages 2021–22

Representation Learning on Graphs and Networks

Principal lecturers: Prof Pietro Lio', Dr Petar Veličković
Taken by: MPhil ACS, Part III
Code: L45
Hours: 16 (12hrs lectures + 4hrs practicals)
Class limit: max. 30 students
Prerequisites: A basic background in machine learning with deep neural networks

Aims and Objectives

Most of the patterns we see in nature are elegantly representable in the language of graphs. Prominent examples include molecules (represented as graphs of atoms and bonds), social networks and transportation networks. Application areas that have already felt the impact of graph representation learning include traffic forecasting, drug discovery, social network analysis and recommender systems. Further, some of the most successful application domains for machine learning in recent years---image, text and speech processing---can be seen as special cases of graph representation learning, and consequently there has been a significant exchange of ideas between these areas.

The module will equip students to analyse graph-structured data effectively and to position graph representation learning in its proper context among related fields. The main aim of the course is to enable students to make direct contributions to the field of graph representation learning: to thoroughly assimilate the key concepts in the area and to draw relevant connections to various other fields (such as NLP, Fourier Analysis and Probabilistic Graphical Models). We assume only a basic background in machine learning with deep neural networks. The course aims to empower students to discover new ideas in this area in future years.

Learning outcomes

  • Fundamentals of processing data on graphs, as well as impactful application areas for graph representation learning
  • Theoretical principles of graph learning: permutation invariance and equivariance
  • The three "flavours" of spatial graph neural networks (GNNs) (convolutional, attentional, message passing) and their relative merits
  • Relevant connections of graph representation learning to various other fields (such as NLP, Fourier Analysis and Probabilistic Graphical Models)
  • The "bigger picture" of GNNs within a broader geometric deep learning framework


Syllabus

The lectures will cover the following topics:

  • Why study data on graphs? Success stories: drug screening, travel time estimation, recommender systems. Fundamentals of graph data processing: network science, spectral clustering, node embeddings.
  • Inductive biases, with a brief look at CNNs. Permutation invariance and equivariance on sets and graphs. The principal tasks of node, edge and graph classification. Neural networks for point clouds: Deep Sets, PointNet; universal approximation properties.
  • The three flavours of spatial GNNs: convolutional, attentional, message passing. Prominent examples: ChebyNets, GCN, SGC, MoNet, GAT, GaAN, IN, MPNN, GraphNets. Tradeoffs of using different GNN variants.
  • Revisiting node embeddings, and their link to conv-GNNs. Self-supervised learning with GNNs: random-walk objectives (VGAE), contrastive (DGI, GRACE), bootstrapping (BGRL), feature masking. The effectiveness of randomly initialised GNNs.
  • Revisiting Deep Sets: how to apply GNNs when there is no graph? Links to natural language processing---Transformers as a special case of attentional GNNs. Methodologies for latent graph inference: DGCNN, NRI, DGM, PGN.
  • Expressive power of graph neural networks: the Weisfeiler-Lehman hierarchy. GINs as a maximally expressive GNN. Links between GNNs and graph algorithms.
  • The bigger picture of learning with invariances and symmetries: geometric deep learning. Worked examples: circulant matrices on grids, the discrete Fourier transform, and convolutional networks on spheres. Graph Fourier transform and the Laplacian eigenbasis. Implications for spatial GNN flavours.
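The permutation symmetry at the heart of the second lecture topic can be made concrete in a few lines of NumPy. The sketch below (with random stand-in weights; all names are illustrative, not course code) shows the Deep Sets pattern: a sum readout is invariant to node reordering, and a per-node transform that mixes in the pooled readout is equivariant.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))        # 5 nodes, 8 features each
P = np.eye(5)[rng.permutation(5)]  # a random permutation matrix

# Permutation-INVARIANT readout: summing over nodes ignores their order.
def readout(X):
    return X.sum(axis=0)

assert np.allclose(readout(P @ X), readout(X))

# Permutation-EQUIVARIANT layer (Deep Sets pattern): each node is
# transformed individually, plus a shared term built from the readout.
W1, W2 = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))

def layer(X):
    return X @ W1 + readout(X) @ W2  # the pooled row broadcasts to all nodes

# Permuting the inputs permutes the outputs in exactly the same way.
assert np.allclose(layer(P @ X), P @ layer(X))
```

The same two properties generalise to graphs once the shared term is restricted to each node's neighbourhood, which is precisely what the GNN flavours in the third topic do.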
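The convolutional flavour can likewise be sketched in NumPy. The snippet below (a toy 4-node graph with random stand-in weights, not course code) implements one GCN-style layer with symmetric degree normalisation: each node averages its neighbours' features with fixed coefficients determined by the graph structure.

```python
import numpy as np

# Toy undirected 4-node graph; self-loops are added before normalising.
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 1.],
              [0., 1., 0., 1.],
              [0., 1., 1., 0.]])
A_hat = A + np.eye(4)

# Symmetric normalisation D^{-1/2} (A + I) D^{-1/2}, as in GCN.
d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))  # input node features
W = rng.normal(size=(3, 2))  # learnable weight matrix (random stand-in)

# One convolutional GNN layer: aggregate neighbours, transform, nonlinearity.
H = np.maximum(A_norm @ X @ W, 0.0)
assert H.shape == (4, 2)
```

The attentional and message-passing flavours replace the fixed coefficients in `A_norm` with learned attention weights or with arbitrary learned per-edge messages, at increasing computational cost.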
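The Weisfeiler-Lehman hierarchy from the sixth topic can be illustrated with a short 1-WL colour-refinement routine (a simplified illustration, not course code). A well-known failure case: 1-WL, and hence any standard message-passing GNN, cannot distinguish a 6-cycle from two disjoint triangles, since both graphs are 2-regular.

```python
from collections import Counter

def wl_histogram(adj, rounds=3):
    """1-WL colour refinement; adj maps each node to its neighbour list."""
    colour = {v: 0 for v in adj}  # start with a uniform colouring
    for _ in range(rounds):
        # New colour = own colour plus the multiset of neighbour colours.
        sig = {v: (colour[v], tuple(sorted(colour[u] for u in adj[v])))
               for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        colour = {v: palette[sig[v]] for v in adj}
    return Counter(colour.values())  # the graph's colour histogram

cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
             3: [4, 5], 4: [3, 5], 5: [3, 4]}

# Both graphs are 2-regular, so refinement never breaks the symmetry.
assert wl_histogram(cycle6) == wl_histogram(triangles)
```

GINs are constructed so that their aggregation is injective on such multisets, matching (but not exceeding) the discriminative power of this test.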
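Finally, the graph Fourier transform in the last topic can be sketched directly (using a toy 5-node path graph; an illustration, not course code): the eigenvectors of the Laplacian L = D - A form an orthonormal basis playing the role of Fourier modes, with the eigenvalues acting as frequencies.

```python
import numpy as np

# Path graph on 5 nodes and its combinatorial Laplacian L = D - A.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
L = np.diag(A.sum(axis=1)) - A

# Laplacian eigenbasis: the columns of U are the graph's "Fourier modes",
# ordered by eigenvalue (frequency), starting from the constant vector.
lam, U = np.linalg.eigh(L)

x = np.sin(np.linspace(0.0, np.pi, 5))  # a smooth signal on the path
x_hat = U.T @ x                         # graph Fourier transform
x_rec = U @ x_hat                       # inverse transform

assert np.allclose(x_rec, x)  # U is orthonormal, so the transform inverts
assert lam.min() > -1e-10     # L is positive semidefinite
```

Spectral convolutions filter `x_hat` before transforming back; ChebyNets and GCN approximate this with polynomials of L, which is what ties the spectral view to the convolutional flavour.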


Practical work

The practicals are designed to complement the material taught in lectures and to teach students to derive additional important results and architectures not covered directly in lectures. The practicals will be given as a series of individual exercises (each either a code implementation or a proof/derivation). Each exercise is assessed individually against a specified mark budget.

Possible practical topics include the study of higher-order GNNs and equivariant message passing.


Assessment

  • (70%) Mini-project (writeup) at the end of the course, in the student's area of choice within graph representation learning. Mini-projects can be self-proposed, and will consist of implementing and/or extending graph representation learning models from the literature and applying them to publicly available datasets. The writeup is limited to 4,000 words (in line with other modules).
  • (30%) Practical work completion: completing the exercises specified in the practicals to a satisfactory standard. The practical assessor should be satisfied that the student derived their answers using insight gained from the course, coupled with original thought, rather than by simply copy-pasting from related work. Students will submit code and a short report, which will be marked against the predetermined mark budget for each practical item.

Students will learn how to run advanced architectures on GPUs, but there is no specific need for dedicated GPU resources. The practicals can be completed on CPU; if required, students can use GPUs on publicly available free services (such as Colab) for their mini-project work.


Recommended reading

The course will be based on the following literature:

  • "Graph Representation Learning", by Will Hamilton
  • "Geometric Deep Learning: Grids, Graphs, Groups, Geodesics, and Gauges", by Michael Bronstein, Joan Bruna, Taco Cohen and Petar Veličković
  • "Deep Learning", by Ian Goodfellow, Yoshua Bengio and Aaron Courville.

Further Information

Due to COVID-19, the method of teaching for this module will be adjusted to cater for physical distancing and students who are working remotely. We will confirm precisely how the module will be taught closer to the start of term.