
Jonathan Gordon

Machine Learning PhD Student

University of Cambridge

About Me

I am a PhD candidate in the Computational and Biological Learning group at the University of Cambridge, supervised by Dr José Miguel Hernández-Lobato and advised by Dr Richard Turner. My research focuses on developing probabilistic models (typically parameterized by deep neural networks) and associated scalable approximate inference procedures.

I am particularly interested in developing deep probabilistic models that capture structure in specific data modalities and exhibit desirable properties such as sample and parameter efficiency, generalization, and calibrated uncertainty. Much of my recent work has focused on meta-learning, which enables fast few-shot generalization. Another theme in my research is modelling symmetries in data, for example by incorporating translation or permutation equivariance.
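
To make the equivariance point concrete, here is a minimal numpy sketch (my own illustration, not code from any of the publications below) of a Deep Sets-style permutation-equivariant linear layer: every set element is transformed by the same weights, plus an interaction term through the set sum, so permuting the input rows simply permutes the output rows.

```python
import numpy as np

def perm_equivariant_layer(X, Lam, Gam):
    """Deep Sets-style equivariant layer: f(X) = X @ Lam + 1 (1^T X) @ Gam.

    X   : (n, d) array, one row per set element.
    Lam : (d, k) per-element weights.
    Gam : (d, k) weights applied to the (permutation-invariant) set sum.
    """
    return X @ Lam + X.sum(axis=0, keepdims=True) @ Gam

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Lam, Gam = rng.normal(size=(3, 4)), rng.normal(size=(3, 4))

perm = rng.permutation(5)
# Permuting the input rows then applying the layer gives the same result
# as applying the layer and then permuting the output rows.
assert np.allclose(perm_equivariant_layer(X[perm], Lam, Gam),
                   perm_equivariant_layer(X, Lam, Gam)[perm])
```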

Interests

  • Probabilistic machine learning
  • Meta-learning and few-shot learning
  • Probabilistic modelling and inference
  • Symmetries and equivariance in machine learning

Education

  • PhD in Probabilistic Machine Learning, Present

    University of Cambridge

  • MPhil in Machine Learning, 2017

    University of Cambridge

  • MSc in Applied Statistics and Engineering, 2016

    Ben-Gurion University of the Negev

  • BSc in Engineering, 2014

    Ben-Gurion University of the Negev

Recent Publications

Meta-Learning Stationary Stochastic Process Prediction with Convolutional Neural Processes

We extend ConvCNPs to construct and meta-learn translation equivariant maps from the space of data sets to predictive stochastic …

Predictive Complexity Priors

Defining priors for machine learning models via complexity considerations in function space.

Convolutional Conditional Neural Processes

We extend deep sets to functional embeddings and Neural Processes to include translation equivariant members (Oral Presentation).

Permutation Equivariant Models for Compositional Generalization in Language

We propose a link between permutation equivariance and compositional generalization, and provide equivariant language models.

Recent Posts

A Gentle Introduction to Deep Sets and Neural Processes

In this post, I will discuss two topics that I have been thinking a lot about recently: Deep Sets and Neural Processes. I'll lay out …
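
As a taste of the post's subject, a minimal sketch (illustrative only; the toy encoder phi below is my own stand-in for a learned network) of the sum-decomposition shared by Deep Sets and Neural Processes: embed each element with a shared map, then sum, giving a representation that is invariant to the ordering of the set.

```python
import numpy as np

def phi(x):
    """Toy per-element encoder; in practice a learned neural network."""
    return np.tanh(np.stack([x, x**2, np.sin(x)], axis=-1))

def deep_set_embedding(xs):
    """Sum-decomposition r = sum_i phi(x_i): invariant to set ordering."""
    return phi(xs).sum(axis=0)

xs = np.array([0.3, -1.2, 2.0, 0.7])
r1 = deep_set_embedding(xs)
r2 = deep_set_embedding(xs[::-1])  # same set, reversed order
assert np.allclose(r1, r2)
```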

A Brief, High-Level Intro to Amortized VI

In this post I will give a very high-level introduction to the concept of amortized variational inference. Before diving in, let me …
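
For a flavour of the idea, a small sketch (my own, assuming a diagonal Gaussian variational family; the weights here are random rather than learned): instead of optimizing separate variational parameters for each datapoint, a single inference network maps any x to its variational parameters, amortizing the cost of inference across the dataset.

```python
import numpy as np

rng = np.random.default_rng(0)
# Weights of the shared inference network (random here; learned in practice).
W_mu = rng.normal(size=(2, 5))
W_logsig = rng.normal(size=(2, 5))

def inference_network(x):
    """Amortized q(z | x): one shared map from a datapoint to its
    variational parameters, instead of per-datapoint (mu_i, sigma_i)."""
    return W_mu @ x, np.exp(W_logsig @ x)

def sample_q(x):
    """Reparameterized sample z = mu + sigma * eps with eps ~ N(0, I)."""
    mu, sigma = inference_network(x)
    return mu + sigma * rng.standard_normal(mu.shape)

x = rng.normal(size=5)
z = sample_q(x)  # a posterior sample for this datapoint
```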

On Model-Based vs. Model-Free AI

An interesting debate has arisen lately in the machine learning community concerning two competing (?) approaches to ML and (more …

Recent Advances in Few-Shot Learning

Few-shot learning is (in my opinion) one of the most interesting and important research areas in ML. It touches at the very core of …

What is a Bayesian Neural Network?

A Bayesian neural network (BNN) refers to extending standard networks by treating the weights as random variables. Thus, training a BNN …
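
In standard notation (a generic summary, not drawn from the post itself): training corresponds to inferring the posterior over weights, and prediction averages the network's output under that posterior:

```latex
p(\mathbf{w} \mid \mathcal{D})
  = \frac{p(\mathcal{D} \mid \mathbf{w})\, p(\mathbf{w})}{p(\mathcal{D})},
\qquad
p(y^\ast \mid x^\ast, \mathcal{D})
  = \int p(y^\ast \mid x^\ast, \mathbf{w})\, p(\mathbf{w} \mid \mathcal{D})\, \mathrm{d}\mathbf{w}
```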

Internships & Experience

PhD Research Intern

Facebook AI Research

Jul 2019 – Nov 2019, Paris, France
Worked with Diane Bouchacourt, David Lopez-Paz, and Marco Baroni on compositional generalization in language models.

PhD Research Intern

Microsoft Research

Jul 2018 – Nov 2018, Cambridge, Massachusetts
Worked with Nicolo Fusi and Francesco Paolo Casale on probabilistic and memory-efficient neural architecture search.

Contact

  • Trumpington St., Cambridge, CB2 1PZ