Jacob Daniel Moss

I'm a PhD student in Machine Learning at the University of Cambridge Computer Laboratory, where I am supervised by Prof. Pietro Liò and Jeremy England. My research focuses on:

  • Differential equation models of genetic regulation;
  • Models of single-cell RNA, accessibility, and methylation;
  • Latent Force Models;
  • Stochastic Neural ODEs;
  • Graph machine learning.

I am also working on an Introduction to Probabilistic Machine Learning booklet, aimed at readers with a background in computer science. It covers topics such as MAP estimation, Gaussian Processes, MCMC, Variational Inference, and Stochastic Calculus, and it is updated regularly.

Previously, I completed a Master's in Advanced Computer Science at the University of Cambridge. Beyond academia, I have worked on a number of industry projects, including a non-invasive, clinically certified heart-rate monitoring wristband, which won Innovator of the Year at the 2018 Future Health Summit, and an auction-house asset-price prediction system for a quantitative trading firm.

Please feel free to contact me if you would like to collaborate on any of the research areas listed above!

Selected Publications

Approximate Latent Force Model Inference

Physically-inspired latent force models offer an interpretable alternative to purely data-driven tools for inference in dynamical systems. They carry the structure of differential equations and the flexibility of Gaussian processes, yielding interpretable parameters and dynamics-imposed latent functions. However, the existing inference techniques associated with these models rely on the exact computation of posterior kernel terms, which are seldom available in analytical form. Most applications relevant to practitioners, such as Hill equations or diffusion equations, are hence intractable. In this paper, we overcome these computational problems by constructing a variational solution to a general class of non-linear and parabolic partial differential equation latent force models. Further, we show that a neural operator approach can scale our model to thousands of instances, enabling fast, distributed computation.
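To make the latent force model idea concrete, here is a toy illustration (not the paper's inference method): a first-order ODE whose driving input f(t) is a draw from a squared-exponential Gaussian process, integrated with forward Euler. The parameter names (basal, decay, sensitivity) are illustrative.

```python
import numpy as np

# Toy first-order latent force model:
#   dx/dt = basal - decay * x + sensitivity * f(t),
# where f(t) is sampled from a squared-exponential Gaussian process.
rng = np.random.default_rng(0)

t = np.linspace(0.0, 10.0, 200)
lengthscale = 1.5
# Squared-exponential kernel matrix over the time grid.
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / lengthscale**2)
f = rng.multivariate_normal(np.zeros_like(t), K + 1e-8 * np.eye(t.size))

basal, decay, sensitivity = 0.5, 0.8, 1.0
dt = t[1] - t[0]
x = np.empty_like(t)
x[0] = 0.0
for i in range(1, t.size):  # forward-Euler integration of the ODE
    x[i] = x[i - 1] + dt * (basal - decay * x[i - 1] + sensitivity * f[i - 1])
```

The interpretability comes from the ODE parameters: decay and sensitivity have direct physical meaning, while the GP prior on f(t) supplies the flexibility.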

Moss, J. D., Opolka, F. L., Dumitrascu, B., Liò, P.
In Science-Guided AI Symposium at AAAI 2021.
Also in NeurIPS 2021 workshop on Machine Learning and the Physical Sciences.
Paper

Meta-learning using privileged information for dynamics

In the physical sciences, we often have access to structured knowledge in addition to raw observations of a system, such as the value of a conserved quantity or a description of an understood component. Taking advantage of the aggregation flexibility, we extend the Neural ODE Process model to use additional information within the Learning Using Privileged Information setting, and we validate our extension with experiments showing improved accuracy and calibration on simulated dynamics tasks.
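The aggregation idea above can be sketched as follows. Context observations are encoded pointwise and combined with a permutation-invariant mean, and an encoding of the privileged information is folded into the same summary. The encoders, names, and dimensions here are stand-ins, not the paper's architecture.

```python
import numpy as np

# Illustrative aggregation of observations plus privileged information.
rng = np.random.default_rng(0)

def encode_obs(xy):
    """Stand-in pointwise encoder for (t, x) context pairs."""
    return np.tanh(xy)

def encode_priv(p):
    """Stand-in encoder for privileged information (e.g. a conserved quantity)."""
    return np.tanh(p)

context = rng.normal(size=(10, 4))        # 10 context pairs, encoded dim 4
privileged = rng.normal(size=4)           # side information about the system

r_obs = encode_obs(context).mean(axis=0)  # permutation-invariant aggregation
r = r_obs + encode_priv(privileged)       # privileged info joins the summary
```

Because the mean aggregation is permutation-invariant and dimension-preserving, extra information streams can be merged into the latent summary without changing the downstream model.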

Day, B., Norcliffe, A., Moss, J. D., & Liò, P.
In ICLR 2021 workshops on Learning to Learn and SimDL.
Paper

Neural ODE Processes

We introduce Neural ODE Processes (NDPs), a new class of stochastic processes determined by a distribution over Neural ODEs. By maintaining an adaptive data-dependent distribution over the underlying ODE, we show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points. At the same time, we demonstrate that NDPs scale up to challenging high-dimensional time-series with unknown latent dynamics such as rotating MNIST digits.
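A minimal sketch of the underlying Neural ODE component: the state derivative dz/dt is given by a small MLP, and the trajectory is rolled out with Euler steps. NDPs additionally maintain a data-conditioned distribution over this vector field; here, for illustration only, a single randomly initialised deterministic instance is shown.

```python
import numpy as np

# A Neural ODE in miniature: dz/dt = MLP(z), integrated with forward Euler.
rng = np.random.default_rng(0)
dim, hidden = 2, 16
W1 = rng.normal(scale=0.5, size=(hidden, dim))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.5, size=(dim, hidden))
b2 = np.zeros(dim)

def vector_field(z):
    """Small MLP giving dz/dt; training would fit these weights to data."""
    return W2 @ np.tanh(W1 @ z + b1) + b2

def odeint_euler(z0, t):
    """Roll the ODE forward over times t with forward-Euler steps."""
    traj = [z0]
    for dt in np.diff(t):
        traj.append(traj[-1] + dt * vector_field(traj[-1]))
    return np.stack(traj)

t = np.linspace(0.0, 1.0, 50)
z = odeint_euler(np.array([1.0, 0.0]), t)
```

In an NDP, the weights of the vector field would be drawn from a latent distribution conditioned on observed data-points, which is what yields a stochastic process over trajectories.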

Norcliffe, A., Bodnar, C., Day, B., Moss, J. D., & Liò, P.
In International Conference on Learning Representations (ICLR), 2021.
Also in NeurIPS 2020 workshop on Machine Learning and the Physical Sciences.
Paper | Code

Gene Regulatory Network Inference with Latent Force Models

Moss, J. D., and Liò, P. arXiv, 2020.

Talks

Teaching & Supervising