
Department of Computer Science and Technology

Masters

Course pages 2022–23

Representation Learning on Graphs and Networks

Principal lecturers: Prof Pietro Liò, Dr Petar Veličković
Taken by: MPhil ACS, Part III
Code: L45
Term: Lent
Hours: 16 (12hrs lectures + 4hrs practicals)
Class limit: max. 30 students
Prerequisites: A basic background in machine learning with deep neural networks

Aims and Objectives

Most of the patterns we see in nature are elegantly representable using the language of graph structures. Prominent examples include molecules (represented as graphs of atoms and bonds), social networks and transportation networks. Application areas that have already felt this impact include traffic forecasting, drug discovery, social network analysis and recommender systems. Further, some of the most successful application domains for machine learning in recent years (images, text and speech processing) can be seen as special cases of graph representation learning, and consequently there has been significant exchange of ideas between these areas.

The module will equip students to analyse graph-structured data effectively, and to position graph representation learning in its proper context among related fields. The main aim of the course is to enable students to make direct contributions to the field of graph representation learning, to thoroughly assimilate the key concepts in the area, and to draw relevant connections to various other fields (such as NLP, Fourier Analysis and Probabilistic Graphical Models). We assume only a basic background in machine learning with deep neural networks. The course aims to empower students to discover new ideas in this area in future years.

Learning outcomes

  • Fundamentals of processing data on graphs, as well as impactful application areas for graph representation learning
  • Theoretical principles of graph learning: permutation invariance and equivariance (written out formally just after this list)
  • The three "flavours" of spatial graph neural networks (GNNs) (convolutional, attentional, message passing) and their relative merits
  • Relevant connections of graph representation learning to various other fields (such as NLP, Fourier Analysis and Probabilistic Graphical Models)
  • The "bigger picture" of GNNs within a broader geometric deep learning framework.

Lectures

The lectures will cover the following topics:

  • Why study data on graphs? Success stories: drug screening, travel time estimation, recommender systems. Fundamentals of graph data processing: network science, spectral clustering, node embeddings.
  • Inductive biases, with a brief look at CNNs. Permutation invariance and equivariance on sets and graphs. The principal tasks of node, edge and graph classification. Neural networks for point clouds: Deep Sets, PointNet; universal approximation properties.
  • The three flavours of spatial GNNs: convolutional, attentional, message passing. Prominent examples: ChebyNets, GCN, SGC, MoNet, GAT, GaAN, IN, MPNN, GraphNets. Tradeoffs of using different GNN variants (a minimal layer sketch follows this list).
  • Revisiting node embeddings, and their link to conv-GNNs. Self-supervised learning with GNNs: random-walk objectives (VGAE), contrastive (DGI, GRACE), bootstrapping (BGRL), feature masking. The effectiveness of randomly initialised GNNs.
  • Revisiting Deep Sets: how to apply GNNs when there is no graph? Links to natural language processing---Transformers as a special case of attentional GNNs. Methodologies for latent graph inference: DGCNN, NRI, DGM, PGN.
  • Expressive power of graph neural networks: the Weisfeiler-Lehman hierarchy (see the colour-refinement sketch after this list). GINs as a maximally expressive GNN. Links between GNNs and graph algorithms.
  • The bigger picture of learning with invariances and symmetries: geometric deep learning. Worked examples: circulant matrices on grids, the discrete Fourier transform, and convolutional networks on spheres. Graph Fourier transform and the Laplacian eigenbasis (illustrated in the numpy sketch after this list). Implications for spatial GNN flavours.
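
To make the "three flavours" bullet concrete, below is a minimal numpy sketch of a single spatial GNN layer. All names (gnn_layer, att) are our own, and the bilinear attention score is a simplified stand-in for GAT's mechanism, so treat this as an illustration of the taxonomy rather than the course's reference implementation.

    # One spatial GNN layer: h_i = ReLU(sum_j c_ij * W x_j), where j ranges over
    # the neighbours of i. The three flavours differ only in where c_ij comes from.
    import numpy as np

    def gnn_layer(X, A, W, flavour="convolutional", att=None):
        # X: (n, d) node features; A: (n, n) binary adjacency with self-loops;
        # W: (d, k) weights; att: (k, k) attention weights (a simplifying assumption).
        msgs = X @ W                                  # transform every node's features
        if flavour == "convolutional":                # fixed c_ij = 1/deg(i), GCN-like
            C = A / A.sum(axis=1, keepdims=True)
        elif flavour == "attentional":                # learned c_ij = a(x_i, x_j), GAT-like
            scores = np.where(A > 0, msgs @ att @ msgs.T, -np.inf)
            e = np.exp(scores - scores.max(axis=1, keepdims=True))
            C = e / e.sum(axis=1, keepdims=True)      # softmax over each neighbourhood
        else:
            # full message passing would compute a learned vector-valued message
            # per edge from (x_i, x_j), strictly generalising the two cases above
            raise NotImplementedError
        return np.maximum(C @ msgs, 0.0)              # aggregate neighbourhoods, then ReLU

The convolutional flavour is the cheapest and most scalable; the attentional flavour lets the network weight neighbours by content; full message passing is the most expressive and the most expensive, which is the tradeoff the lecture bullet refers to.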
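The Weisfeiler-Lehman hierarchy mentioned above is grounded in a very short algorithm, 1-WL colour refinement, sketched below in plain Python (function names are ours). Diverging colour histograms prove two graphs non-isomorphic; matching histograms are inconclusive, and standard message-passing GNNs inherit exactly this limitation.

    def wl_refine(adj, iters=3):
        # adj: dict mapping each node to a list of its neighbours
        colours = {v: 0 for v in adj}                     # start with a uniform colouring
        for _ in range(iters):
            # new colour of v = (old colour of v, multiset of neighbour colours)
            sigs = {v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
                    for v in adj}
            palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
            colours = {v: palette[sigs[v]] for v in adj}  # compress signatures to ints
        return colours

    # Classic failure case: a 6-cycle vs. two disjoint triangles. Every node has
    # degree 2, so 1-WL (and hence any standard GNN) cannot distinguish them.
    cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
    triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
    assert sorted(wl_refine(cycle6).values()) == sorted(wl_refine(triangles).values())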
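Finally, the graph Fourier transform in the last bullet amounts to projecting a node signal onto the eigenbasis of the graph Laplacian. A brief numpy sketch, using the combinatorial Laplacian L = D - A (the lectures may equally use the normalised variant):

    import numpy as np

    A = np.array([[0, 1, 0, 0],            # adjacency of a 4-node path graph
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian L = D - A
    lam, U = np.linalg.eigh(L)              # eigenvalues act as graph frequencies

    x = np.array([1.0, 2.0, 3.0, 4.0])      # a signal on the nodes
    x_hat = U.T @ x                         # forward graph Fourier transform
    assert np.allclose(U @ x_hat, x)        # inverse transform recovers the signal

    # Spectral GNNs filter by rescaling x_hat componentwise with some g(lam);
    # ChebyNets approximate g with polynomials to avoid this eigendecomposition.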

Practicals

The practicals are designed to complement the knowledge gained in lectures and to teach students to derive additional important results and architectures not directly shown in lectures. The practicals will be given as a series of individual exercises (each either a code implementation or a proof/derivation). Each exercise will be assessed individually against a specified mark budget.

Possible practical topics include the study of higher-order GNNs and equivariant message passing.

Assessment

  • (60%) Mini-project (writeup) at the end of the course. Mini-projects can either be self-proposed, or students can express a preference for one of the provided topics, a list of which will be announced at the start of term. The projects will consist of implementing and/or extending graph representation learning models from the literature and applying them to publicly available datasets. Students will undertake the project in pairs and submit a joint writeup limited to 4,000 words (in line with other modules); an appendix of work logs is to be included but will not be graded.

  • (10%) Short presentation and viva: students will give a short presentation explaining their individual contribution to the mini-project, followed by a short viva.

  • (30%) Practical work completion: completing the exercises specified in the practicals to a satisfactory standard. The practical assessor should be satisfied that the student derived their answers using insight gained from the course, coupled with original thought, rather than by simply copy-pasting from relevant related work. Students will submit code and a short report, which will then be marked in line with the predetermined mark budget for each practical item.

Students will learn how to run advanced architectures on GPUs, but there is no specific need for dedicated GPU resources. The practicals will be possible to complete on a CPU; if required, students can use GPUs on publicly available free services (such as Colab) for their mini-project work.

References

The course will be based on the following literature:

  • "Graph Representation Learning", by Will Hamilton
  • "Geometric Deep Learning: Grids, Graphs, Groups, Geodesics, and Gauges", by Michael Bronstein, Joan Bruna, Taco Cohen and Petar Veličković
  • "Deep Learning", by Ian Goodfellow, Yoshua Bengio and Aaron Courville.

Further Information

Due to infectious respiratory diseases, the method of teaching for this module may be adjusted to cater for physical distancing and students who are working remotely. Unless otherwise advised, this module will be taught in person.