We extend ConvCNPs to construct and meta-learn translation-equivariant maps from the space of data sets to predictive stochastic processes.
Defining priors for machine learning models via complexity considerations in function space.
A powerful meta-learning system based on the neural process framework (Spotlight).
We reconsider the active learning problem, leveraging advances in Bayesian coresets to relax the standard myopic assumption.
We introduce a framework for combining deep generative and discriminative models for semi-supervised learning.
Iteratively improving variational posteriors for BNNs with gradient descent and auxiliary variables.
A unifying perspective on meta-learning algorithms based on posterior predictive inference.
Introducing VERSA, an efficient and flexible few-shot learner based on amortized inference.
A probabilistic and differentiable framework for neural architecture search that improves speed and memory efficiency.