Talk on an approximate MCMC sampling approach to Bayesian deep learning.
This project aims to further connect the parametric world of neural networks to the non-parametric world of Gaussian processes (GPs) by advancing neural network Gaussian processes (NNGPs).
This project aims to develop approximate MCMC methods for Bayesian deep learning in order to quantify the uncertainty of predictions made by neural networks.
Markov chain Monte Carlo (MCMC) methods have not been broadly adopted in Bayesian neural networks (BNNs). This paper begins by reviewing the main challenges in sampling from the parameter posterior of a neural network via MCMC. Such challenges …
Talk on the predictive capacity of Bayesian marginalization for neural networks in the absence of MCMC convergence.
There has recently been much work on the 'wide limit' of neural networks, where Bayesian neural networks (BNNs) are shown to converge to a Gaussian process (GP) as the widths of all hidden layers are taken to infinity. However, these results do not apply to …