Neural Diffusion Processes

Author(s): Vincent Dutordoir, Alan Saul, Zoubin Ghahramani, Fergus Simpson
Venue: arXiv
Year: 2022

Paper: https://arxiv.org/abs/2206.03992

Abstract

Gaussian processes provide an elegant framework for specifying prior and posterior distributions over functions. They are, however, computationally expensive, and limited by the expressivity of their covariance function. We propose Neural Diffusion Processes (NDPs), a novel approach based upon diffusion models, which learns to sample from distributions over functions. Using a novel attention block, we can incorporate properties of stochastic processes, such as exchangeability, directly into the NDP's architecture. We empirically show that NDPs are able to capture functional distributions that are close to the true Bayesian posterior of a Gaussian process. This enables a variety of downstream tasks, including hyperparameter marginalisation and Bayesian optimisation.
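As a rough illustration of the core idea (not the authors' implementation), a diffusion model over functions operates on function values evaluated at a set of input locations, rather than on images or other fixed-size data. The sketch below shows only the closed-form forward noising step q(y_t | y_0) = N(sqrt(ᾱ_t) y_0, (1 − ᾱ_t) I) applied to a function sampled from a squared-exponential GP prior; the noise schedule, kernel, and all names here are assumptions for illustration, and the learned reverse (denoising) process that NDPs train is omitted.

```python
import numpy as np


def forward_noise(y0, t, alpha_bar, rng):
    """Closed-form forward diffusion: y_t = sqrt(abar_t) * y0 + sqrt(1 - abar_t) * eps."""
    eps = rng.standard_normal(y0.shape)
    yt = np.sqrt(alpha_bar[t]) * y0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return yt, eps


rng = np.random.default_rng(0)

# Hypothetical linear beta schedule over T diffusion steps.
T = 100
betas = np.linspace(1e-4, 0.02, T)
alpha_bar = np.cumprod(1.0 - betas)  # cumulative product: abar_t = prod_{s<=t} (1 - beta_s)

# Draw one "clean" function y0 from a squared-exponential GP prior at n inputs.
n = 50
x = np.linspace(-3.0, 3.0, n)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2) + 1e-6 * np.eye(n)
y0 = np.linalg.cholesky(K) @ rng.standard_normal(n)

# Noise the function at the final step; y_t is then close to standard Gaussian noise,
# and the model would be trained to predict eps from (x, y_t, t) to reverse this.
yt, eps = forward_noise(y0, T - 1, alpha_bar, rng)
```

Because the noising acts pointwise on the function values, the same recipe applies to any number and placement of inputs x, which is what lets such a model represent distributions over functions rather than over fixed-dimensional vectors.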
