PDL

The premise of approximate MCMC in Bayesian deep learning (UCSB)

Talk on an approximate MCMC sampling approach to Bayesian deep learning.

The premise of approximate MCMC in Bayesian deep learning (University of Warwick)

Talk on an approximate MCMC sampling approach to Bayesian deep learning.

NNGPs

This project aims to further connect the parametric world of neural networks with the non-parametric world of Gaussian processes (GPs) by advancing neural network Gaussian processes (NNGPs).
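
As a concrete illustration of the NN-GP correspondence that NNGPs rest on (a minimal sketch, not this project's code), the snippet below computes the NNGP kernel of a one-hidden-layer ReLU network in the infinite-width limit; the weight and bias variances sigma_w2 and sigma_b2 and the 1/d input scaling are illustrative conventions.

import numpy as np

def nngp_kernel_relu(X1, X2, sigma_w2=1.0, sigma_b2=1.0):
    # NNGP kernel of a one-hidden-layer ReLU network in the infinite-width limit:
    # K(x, x') = sigma_b2 + sigma_w2 * E[relu(u) relu(v)], where (u, v) are jointly
    # Gaussian with base kernel K0(x, x') = sigma_b2 + sigma_w2 * (x . x') / d.
    d = X1.shape[1]
    K12 = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d              # K0 cross-covariances
    k1 = sigma_b2 + sigma_w2 * np.sum(X1 ** 2, axis=1) / d   # K0(x, x)
    k2 = sigma_b2 + sigma_w2 * np.sum(X2 ** 2, axis=1) / d   # K0(x', x')
    norms = np.sqrt(np.outer(k1, k2))
    cos_theta = np.clip(K12 / norms, -1.0, 1.0)              # clip for numerical safety
    theta = np.arccos(cos_theta)
    # Closed-form ReLU expectation (arc-cosine kernel of order one).
    expectation = norms * (np.sin(theta) + (np.pi - theta) * cos_theta) / (2.0 * np.pi)
    return sigma_b2 + sigma_w2 * expectation

# Example usage: kernel matrix for a handful of random inputs.
X = np.random.default_rng(0).normal(size=(5, 3))
K = nngp_kernel_relu(X, X)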

Approximate MCMC for Bayesian deep learning

This project aims to develop approximate MCMC methods for Bayesian deep learning in order to quantify the uncertainty of predictions made by neural networks.
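
To make the idea concrete, here is a minimal sketch of one well-known approximate MCMC scheme, stochastic gradient Langevin dynamics (SGLD), applied to Bayesian logistic regression rather than a full neural network for brevity; the step size, batch size and Gaussian prior variance are illustrative choices, not the project's actual setup.

import numpy as np

def sgld_logistic_regression(X, y, n_iters=5000, batch_size=32,
                             step_size=1e-3, prior_var=1.0, seed=0):
    # Stochastic gradient Langevin dynamics: each update is a noisy gradient step
    # on the log posterior, theta += (eps / 2) * grad_estimate + Normal(0, eps).
    rng = np.random.default_rng(seed)
    N, d = X.shape
    theta = np.zeros(d)
    samples = []
    for _ in range(n_iters):
        idx = rng.choice(N, size=batch_size, replace=False)
        Xb, yb = X[idx], y[idx]
        p = 1.0 / (1.0 + np.exp(-Xb @ theta))
        # Minibatch log-likelihood gradient rescaled to the full data set,
        # plus the gradient of a zero-mean Gaussian prior.
        grad = (N / batch_size) * Xb.T @ (yb - p) - theta / prior_var
        theta = theta + 0.5 * step_size * grad \
            + rng.normal(scale=np.sqrt(step_size), size=d)
        samples.append(theta.copy())
    return np.array(samples)

Averaging predictions over the returned samples gives the posterior predictive distribution, which is where the uncertainty estimates come from.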

Challenges in Markov chain Monte Carlo for Bayesian neural networks

Markov chain Monte Carlo (MCMC) methods have not been broadly adopted in Bayesian neural networks (BNNs). This paper initially reviews the main challenges in sampling from the parameter posterior of a neural network via MCMC. Such challenges …

Challenges in Markov chain Monte Carlo for Bayesian neural networks

Talk on the predictive capacity of Bayesian marginalization for neural networks in the absence of MCMC convergence.
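
For context, Bayesian marginalization here refers to the posterior predictive distribution, which MCMC approximates by averaging over parameter samples (standard notation, sketched below):

p(y_* \mid x_*, \mathcal{D})
  = \int p(y_* \mid x_*, \theta) \, p(\theta \mid \mathcal{D}) \, d\theta
  \approx \frac{1}{S} \sum_{s=1}^{S} p(y_* \mid x_*, \theta^{(s)}),
  \qquad \theta^{(s)} \sim p(\theta \mid \mathcal{D}).

The talk asks how much predictive value this average retains when the chains producing the samples \theta^{(s)} have not converged.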

Wide neural networks with bottlenecks are deep Gaussian processes

There has recently been much work on the 'wide limit' of neural networks, where Bayesian neural networks (BNNs) are shown to converge to a Gaussian process (GP) as all hidden layers are sent to infinite width. However, these results do not apply to …
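
For reference, the wide-limit results referred to above use the standard layer-wise covariance recursion (common notation, sketched here, with activation \phi and weight and bias variances \sigma_w^2, \sigma_b^2):

K^{(\ell + 1)}(x, x')
  = \sigma_b^2 + \sigma_w^2 \, \mathbb{E}_{f \sim \mathcal{GP}(0, K^{(\ell)})}
    \big[ \phi(f(x)) \, \phi(f(x')) \big].

Sending every hidden layer to infinite width collapses this recursion to a single GP; as the title indicates, keeping some bottleneck layers at finite width instead yields a composition of GPs, that is, a deep GP.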