Importance Sampling Squared for Bayesian Inference in Latent Variable Models
Date Written: January 25, 2014
Abstract
We consider Bayesian inference by importance sampling when the likelihood is analytically intractable but can be unbiasedly estimated. We refer to this procedure as importance sampling squared (IS2), since the likelihood itself can often be estimated by importance sampling. We provide a formal justification for importance sampling when working with an estimate of the likelihood and study its convergence properties. We analyze the effect of estimating the likelihood on the resulting inference and give guidelines on how to choose the precision of the likelihood estimate so as to obtain an optimal tradeoff between computational cost and accuracy of posterior inference on the model parameters. We illustrate the procedure in empirical applications to a generalized multinomial logit model and a stochastic volatility model. The results show that the IS2 method can deliver fast and accurate posterior inference when implemented at the optimal settings.
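To make the procedure described above concrete, the following is a minimal sketch of the IS2 idea in Python: an outer importance sampler over the parameter uses a likelihood that is itself estimated unbiasedly by an inner importance sampler over the latent variables. The toy Gaussian latent variable model, the proposal densities, and all variable names are illustrative assumptions for exposition only; they are not the paper's applications (a generalized multinomial logit and a stochastic volatility model) nor its recommended importance densities.

```python
# Minimal IS2 sketch under an assumed toy model (not the paper's code):
# y_t | x_t ~ N(x_t, 1),  x_t ~ N(mu, 1),  parameter theta = mu with prior N(0, 10).
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=np.sqrt(2.0), size=50)  # synthetic data, since y_t ~ N(mu, 2)

def loglik_hat(mu, y, n_particles=200):
    """Unbiased importance-sampling estimate of p(y | mu), integrating out the latents.

    Uses the prior N(mu, 1) of x_t as a simple (not efficient) importance density.
    The estimate is unbiased in levels, not in logs."""
    ll = 0.0
    for y_t in y:
        x = rng.normal(mu, 1.0, size=n_particles)                 # particles for x_t
        w = np.exp(-0.5 * (y_t - x) ** 2) / np.sqrt(2 * np.pi)    # p(y_t | x_t)
        ll += np.log(np.mean(w))
    return ll

# Outer importance sampling over theta = mu with proposal g = N(ybar, 1).
M = 1000
ybar = np.mean(y)
mu_draws = rng.normal(ybar, 1.0, size=M)
log_w = np.empty(M)
for i, mu in enumerate(mu_draws):
    log_prior = -0.5 * mu ** 2 / 10 - 0.5 * np.log(2 * np.pi * 10)
    log_g = -0.5 * (mu - ybar) ** 2 - 0.5 * np.log(2 * np.pi)
    # The weight uses the *estimated* likelihood in place of the intractable one.
    log_w[i] = loglik_hat(mu, y) + log_prior - log_g

w = np.exp(log_w - log_w.max())
post_mean = np.sum(w * mu_draws) / np.sum(w)   # self-normalised IS estimate of E[mu | y]
print("IS2 posterior mean of mu:", post_mean)
```

The tradeoff the abstract refers to shows up here in the choice of `n_particles` versus `M`: more particles reduce the noise in each likelihood estimate (and hence the extra variance of the importance weights), but raise the cost per outer draw.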
Keywords: Efficient importance sampling, marginal likelihood, multinomial logit, particle marginal Metropolis-Hastings, optimal number of particles, stochastic volatility
JEL Classification: C32, C51, E43