Bayesian Parallel Computation for Intractable Likelihood Using Griddy-Gibbs Sampler
13 Pages Posted: 5 Dec 2013
Date Written: December 4, 2013
Parallel computation is a fast-growing computing environment in many areas, including computational Bayesian statistics. However, most Bayesian parallel computing has been implemented through sequential Monte Carlo methods, in which model parameters are updated sequentially; this approach suits certain large-scale problems. This paper is the first to revive the adaptive griddy Gibbs (AGG) algorithm within the Markov chain Monte Carlo framework and to show how to implement AGG using parallel computation. The parallel AGG is suitable when (i) the problem is of small to medium scale, so the dimension of the model parameter space is not very high, (ii) some or all model parameters are defined or known to lie on a specific interval, and (iii) the model likelihood is intractable. In addition, the parallel AGG is relatively easy to implement and code. A simulation study of three examples (a linear regression model with Student-t errors, a nonlinear regression model, and a financial time series model) and an empirical study illustrate the applicability of AGG in a parallel computing environment.
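To make the idea concrete, a single griddy-Gibbs update evaluates a parameter's full conditional density on a grid, normalizes it, and draws from the resulting discrete/interpolated CDF; the independent grid evaluations are what parallelize naturally. The following is a minimal sketch of that step, not the authors' AGG implementation: the function name `griddy_gibbs_step` and the toy target are illustrative assumptions.

```python
import numpy as np

def griddy_gibbs_step(log_cond, grid, rng):
    """One griddy-Gibbs draw for a scalar parameter.

    log_cond: callable returning the log full-conditional density at a
              grid point (up to a constant). The evaluations below are
              independent, so this loop is the natural place to
              distribute work across processors.
    grid:     1-D array of points covering the parameter's known interval.
    """
    logp = np.array([log_cond(g) for g in grid])
    p = np.exp(logp - logp.max())      # stabilize before normalizing
    cdf = np.cumsum(p)
    cdf /= cdf[-1]
    u = rng.uniform()
    # inverse-CDF draw, linearly interpolating between grid points
    return np.interp(u, cdf, grid)

# Toy usage: draw from a standard normal restricted to [-4, 4]
rng = np.random.default_rng(0)
grid = np.linspace(-4.0, 4.0, 201)
draws = [griddy_gibbs_step(lambda x: -0.5 * x**2, grid, rng)
         for _ in range(2000)]
```

In a full sampler, this step is repeated for each parameter in turn, with the grid adapted between sweeps; the paper's point is that the grid evaluations, being mutually independent, map directly onto a parallel computing environment.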
Keywords: Bayesian analysis, Griddy Gibbs, Parallel computing, Markov chain Monte Carlo