Defining priors for machine learning models via complexity considerations in function space.
Deriving a form of batch normalization tailored to meta-learning models.
We extend deep sets to functional embeddings and Neural Processes to include translation-equivariant members (Oral Presentation).
We propose a link between permutation equivariance and compositional generalization, and provide equivariant language models.
A powerful meta-learning system based on the neural process framework (Spotlight).
We introduce a framework for combining deep generative and discriminative models for semi-supervised learning.
We introduce ML-PIP, a general probabilistic framework for meta-learning.
We introduce VERSA, an efficient and flexible few-shot learner based on amortized inference.
A probabilistic and differentiable framework for neural architecture search that improves speed and memory efficiency.