Theodore Papamarkou
Bayesian neural networks and dimensionality reduction
In conducting non-linear dimensionality reduction and feature learning, it is common to suppose that the data lie near a …
Deborshee Sen, Theodore Papamarkou, David Dunson
Towards efficient MCMC sampling in Bayesian neural networks by exploiting symmetry
Bayesian inference in deep neural networks is challenging due to the high-dimensional, strongly multi-modal parameter posterior density …
Jonas Gregor Wiese, Lisa Wimmer, Theodore Papamarkou, Bernd Bischl, Stephan Günnemann, David Rügamer
Approximate blocked Gibbs sampling for Bayesian neural networks
In this work, minibatch MCMC sampling for feedforward neural networks is made more feasible. To this end, it is proposed to sample …
Theodore Papamarkou
The premise of approximate MCMC in Bayesian deep learning (UCSB)
Talk on an approximate MCMC sampling approach to Bayesian deep learning.
Sep 26, 2022 3:30 PM — 4:30 PM
University of California, Santa Barbara, Department of Computer Science
The premise of approximate MCMC in Bayesian deep learning (University of Warwick)
Talk on an approximate MCMC sampling approach to Bayesian deep learning.
Sep 8, 2022 9:10 AM — 9:35 AM
University of Warwick, Department of Statistics
Approximate MCMC for Bayesian deep learning
This project aims to develop approximate MCMC methods for Bayesian deep learning in order to quantify the uncertainty of predictions made by neural networks.
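As a rough illustration of the idea behind this project, the sketch below runs a plain random-walk Metropolis sampler over the weights of a tiny one-hidden-layer network and reads predictive uncertainty off the posterior samples. This is a minimal toy example, not the approximate MCMC methods developed in the project; all settings (network size, step size, noise level) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: noisy sine curve.
X = np.linspace(-2, 2, 40).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.normal(size=40)

H = 5                            # hidden units (assumed)
n_params = H + H + H + 1         # W1, b1, W2, b2 for a 1-H-1 network

def unpack(theta):
    W1 = theta[:H].reshape(1, H)
    b1 = theta[H:2 * H]
    W2 = theta[2 * H:3 * H].reshape(H, 1)
    b2 = theta[3 * H]
    return W1, b1, W2, b2

def predict(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    return (np.tanh(X @ W1 + b1) @ W2).ravel() + b2

def log_post(theta):
    # Gaussian likelihood (sigma = 0.1) plus standard normal prior on weights.
    resid = y - predict(theta, X)
    return -0.5 * np.sum(resid ** 2) / 0.1 ** 2 - 0.5 * np.sum(theta ** 2)

# Random-walk Metropolis over the full weight vector.
theta = rng.normal(size=n_params)
lp = log_post(theta)
samples = []
for i in range(5000):
    prop = theta + 0.05 * rng.normal(size=n_params)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 2500 and i % 10 == 0:   # discard burn-in, thin the chain
        samples.append(theta.copy())
samples = np.array(samples)

# Predictive uncertainty at a test input: spread across posterior samples.
x_star = np.array([[0.5]])
preds = np.array([predict(s, x_star)[0] for s in samples])
print("predictive mean:", preds.mean(), "predictive sd:", preds.std())
```

Scaling this beyond toy networks is exactly where exact MCMC breaks down (high-dimensional, multi-modal posteriors), which motivates the approximate and minibatch samplers studied in the publications listed here.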
Challenges in Markov chain Monte Carlo for Bayesian neural networks
Markov chain Monte Carlo (MCMC) methods have not been broadly adopted in Bayesian neural networks (BNNs). This paper initially reviews …
Theodore Papamarkou, Jacob Hinkle, M. Todd Young, David Womble
Challenges in Markov chain Monte Carlo for Bayesian neural networks
Talk on the predictive capacity of Bayesian marginalization for neural networks in the absence of MCMC convergence.
Apr 23, 2021 3:00 PM — 4:00 PM
University of Glasgow, School of Mathematics and Statistics
Wide neural networks with bottlenecks are deep Gaussian processes
There has recently been much work on the ‘wide limit’ of neural networks, where Bayesian neural networks (BNNs) are shown …
Devanshu Agrawal, Theodore Papamarkou, Jacob Hinkle