Speeding Up MCMC by Delayed Acceptance and Data Subsampling
Riksbank Research Paper Series No. 131
Sveriges Riksbank Working Paper Series No. 307
24 Pages
Posted: 22 Dec 2015
Date Written: August 2015
Abstract
The complexity of Markov chain Monte Carlo (MCMC) algorithms arises from the requirement to evaluate the likelihood of the full data set in each iteration. Payne and Mallick (2014) propose speeding up the Metropolis-Hastings algorithm by a delayed acceptance approach in which the acceptance decision proceeds in two stages. In the first stage, an estimate of the likelihood based on a random subsample determines whether the draw is likely to be accepted and, if so, the second stage uses the full data likelihood to decide upon final acceptance. Evaluating the full data likelihood is thus avoided for draws that are unlikely to be accepted. We propose a more precise likelihood estimator that incorporates auxiliary information about the full data likelihood while only operating on a sparse set of the data. We prove that the resulting delayed acceptance MCMC is asymptotically more efficient than that of Payne and Mallick (2014). Furthermore, we adapt the method to handle data sets that are too large to fit in Random-Access Memory (RAM). This adaptation results in an algorithm that samples from an approximate posterior whose theoretical properties are well studied in the literature.
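To make the two-stage logic concrete, the following is a minimal Python sketch of delayed acceptance Metropolis-Hastings for an illustrative i.i.d. normal-mean model. It uses a simple (n/m)-scaled subsample estimator in the first stage, not the paper's more precise estimator based on auxiliary information, and the synthetic data, subsample size m, prior, and random-walk step size are all assumptions made for the example.

import numpy as np

# Minimal sketch of two-stage delayed acceptance Metropolis-Hastings.
# Illustrative model: y_i ~ N(theta, 1). The scaled subsample estimate below
# stands in for the first-stage likelihood estimator and is NOT the paper's
# more precise estimator that uses auxiliary information about the likelihood.
rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=1.0, size=100_000)   # synthetic data (assumption)
n, m = y.size, 1_000                               # full data size, subsample size

def loglik_full(theta):                            # full-data log-likelihood (up to a constant)
    return -0.5 * np.sum((y - theta) ** 2)

def loglik_est(theta, idx):                        # simple (n/m)-scaled subsample estimate
    return -0.5 * (n / m) * np.sum((y[idx] - theta) ** 2)

def log_prior(theta):                              # vague N(0, 10^2) prior (assumption)
    return -0.5 * theta ** 2 / 100.0

theta = float(np.mean(y[:m]))                      # crude starting value
ll_full_curr = loglik_full(theta)
step, draws = 0.01, []
for _ in range(2_000):
    prop = theta + step * rng.standard_normal()    # symmetric random-walk proposal
    idx = rng.choice(n, size=m, replace=False)     # fresh subsample for this iteration
    # Stage 1: cheap screening using the estimated likelihood at both points.
    log_a1 = (loglik_est(prop, idx) + log_prior(prop)
              - loglik_est(theta, idx) - log_prior(theta))
    if np.log(rng.uniform()) < log_a1:
        # Stage 2: the full data likelihood is evaluated only for proposals that
        # pass stage 1; the correction ratio below keeps the exact posterior as target.
        ll_full_prop = loglik_full(prop)
        log_a2 = (ll_full_prop - ll_full_curr) - (loglik_est(prop, idx) - loglik_est(theta, idx))
        if np.log(rng.uniform()) < log_a2:
            theta, ll_full_curr = prop, ll_full_prop
    draws.append(theta)

print("approximate posterior mean:", np.mean(draws[500:]))

Note that the expensive full-data evaluation in stage 2 is skipped whenever the cheap stage-1 screen rejects the proposal, which is the source of the speed-up described in the abstract.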
Keywords: Bayesian inference, Markov chain Monte Carlo, Delayed acceptance MCMC, Large data, Survey sampling
JEL Classification: C11, C13, C15, C55, C83