Deep Learning

Predictive Complexity Priors

Defining priors for machine learning models via complexity considerations in function space.

TaskNorm: Rethinking Batch Normalization for Meta-Learning

Deriving a form of batch normalization tailored to meta-learning models.

Convolutional Conditional Neural Processes

We extend deep sets to functional embeddings and Neural Processes to include translation-equivariant members (Oral Presentation).

Permutation Equivariant Models for Compositional Generalization in Language

We propose a link between permutation equivariance and compositional generalization, and provide equivariant language models.

Fast and Flexible Multi-Task Classification Using Conditional Neural Adaptive Processes

A powerful meta-learning system based on the neural process framework (Spotlight).

Combining Deep Generative and Discriminative Models for Bayesian Semi-Supervised Learning

We introduce a framework for combining deep generative and discriminative models for semi-supervised learning.

Meta Learning Probabilistic Inference for Prediction

We introduce ML-PIP, a general probabilistic framework for meta-learning.

VERSA: Versatile and Efficient Few-shot Learning

Introducing VERSA, an efficient and flexible few-shot learner based on amortized inference.

Probabilistic Neural Architecture Search

A probabilistic and differentiable framework for neural architecture search that improves speed and memory efficiency.