We extend ConvCNPs to construct and meta-learn translation equivariant maps from the space of data sets to predictive stochastic processes.
We derive a form of batch normalization tailored to meta-learning models.
We extend deep sets to functional embeddings and extend Neural Processes to include translation equivariant members (Oral Presentation).
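As a rough illustration of such a functional embedding, the sketch below (a hypothetical NumPy implementation, assuming a Gaussian RBF kernel and a simple density-plus-data feature map, not the paper's exact architecture) maps a data set to a function of the query location; shifting data and queries together leaves the output unchanged, i.e. the map is translation equivariant:

```python
import numpy as np

def functional_embedding(xs, ys, query, length_scale=1.0):
    """Embed a data set {(x_i, y_i)} as a function evaluated at `query`:
    E(Z)(t) = sum_i phi(y_i) * psi(t - x_i), with psi an RBF kernel.
    Because psi depends only on differences t - x_i, shifting all xs and
    query points by the same amount leaves the result unchanged."""
    # phi(y) = (1, y): a density channel and a data channel (assumed here)
    phi = np.stack([np.ones_like(ys), ys], axis=-1)    # (n, 2)
    diffs = query[:, None] - xs[None, :]               # (m, n) pairwise t - x_i
    psi = np.exp(-0.5 * (diffs / length_scale) ** 2)   # (m, n) kernel weights
    return psi @ phi                                   # (m, 2) embedded function

xs = np.array([0.0, 1.0, 2.0])
ys = np.array([1.0, -1.0, 0.5])
t = np.linspace(-1.0, 3.0, 50)
emb = functional_embedding(xs, ys, t)

# Translation equivariance: shifting data and queries by the same delta
# yields the same embedding values.
delta = 5.0
emb_shifted = functional_embedding(xs + delta, ys, t + delta)
assert np.allclose(emb, emb_shifted)
```

The embedding is also invariant to permutations of the data set, since it is a sum over points, which is the deep sets property this construction extends.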
A powerful meta-learning system based on the neural process framework (Spotlight).
We introduce ML-PIP, a general probabilistic framework for meta-learning.
A unifying perspective on meta-learning algorithms based on posterior predictive inference.
We introduce VERSA, an efficient and flexible few-shot learner based on amortized inference.