Zero variance differential geometric Markov chain Monte Carlo algorithms


Differential geometric Markov chain Monte Carlo (MCMC) strategies exploit the geometry of the target to achieve convergence in fewer MCMC iterations, at the cost of increased computing time per iteration. This computational overhead is regarded as a potential shortcoming of geometric MCMC in practice. This paper suggests that part of the additional computation required by Hamiltonian Monte Carlo and Metropolis-adjusted Langevin algorithms produces elements that allow concurrent implementation of the zero-variance reduction technique for MCMC estimation. Therefore, zero-variance geometric MCMC emerges as an inherently unified sampling scheme, in the sense that variance reduction and geometric exploitation of the parameter space can be performed simultaneously without exceeding the computational requirements posed by the geometric MCMC scheme alone. A MATLAB package is provided, which implements a generic code framework of the combined methodology for a range of models.

In Bayesian Analysis
Theodore Papamarkou
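The central observation, that the log-target gradients a Metropolis-adjusted Langevin algorithm (MALA) already computes can be reused as zero-variance control variates, can be sketched in a few lines. The following Python snippet (illustrative only; the paper's accompanying package is in MATLAB, and all names below are this sketch's own) runs MALA on a one-dimensional Gaussian target and then applies a first-degree polynomial control variate built from the stored gradients. For a Gaussian target and a linear polynomial, the corrected estimator of the mean attains (essentially) zero variance, which is where the method's name comes from.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 3.0, 2.0  # toy Gaussian target N(mu, sigma^2)

def log_pi(t):
    return -0.5 * (t - mu) ** 2 / sigma**2

def grad_log_pi(t):
    return -(t - mu) / sigma**2

# --- MALA: each iteration already evaluates grad_log_pi at the current state ---
eps = 1.0  # step size (illustrative choice)
theta, g = 0.0, grad_log_pi(0.0)
samples, grads = [], []
for _ in range(20000):
    prop = theta + 0.5 * eps**2 * g + eps * rng.standard_normal()
    g_prop = grad_log_pi(prop)
    # MALA log acceptance ratio, with q the Langevin proposal density
    log_q_fwd = -((prop - theta - 0.5 * eps**2 * g) ** 2) / (2 * eps**2)
    log_q_bwd = -((theta - prop - 0.5 * eps**2 * g_prop) ** 2) / (2 * eps**2)
    if np.log(rng.uniform()) < log_pi(prop) - log_pi(theta) + log_q_bwd - log_q_fwd:
        theta, g = prop, g_prop
    samples.append(theta)
    grads.append(g)  # gradients come for free from the sampler

samples = np.array(samples[5000:])  # drop burn-in
# Zero-variance auxiliary variable: z = -grad(log pi)/2, recycled from MALA
z = -0.5 * np.array(grads[5000:])

# First-degree polynomial control variate: f_tilde = f + a*z with f(theta) = theta,
# and a chosen to minimise the variance of f_tilde
C = np.cov(z, samples)
a = -C[0, 1] / C[0, 0]

plain_estimate = samples.mean()          # ordinary MCMC estimate of mu
zv_estimate = (samples + a * z).mean()   # zero-variance corrected estimate
```

Since z has expectation zero under the target, the correction leaves the estimator unbiased while shrinking its variance; in this Gaussian/linear special case the corrected samples collapse onto mu itself.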