Sequential Monte Carlo for Hierarchical Bayes with Large Datasets

17 Pages. Posted: 30 May 2019

Remi Daviet

Wisconsin School of Business

Date Written: May 5, 2019

Abstract

Practical use of Hierarchical Bayes models requires efficient methods for posterior inference. Sequential Monte Carlo (SMC) methods have emerged as an extremely robust way to simulate complicated Bayesian posteriors. The simplest version successively weights and resamples draws, called particles, from a sequence of target distributions. The method has one main weakness that keeps it from being used in complex hierarchical models with large datasets: sample impoverishment. This issue is usually alleviated through a "refreshing" MCMC step. However, to preserve effectiveness, the computational cost grows quadratically with the total number of observations. We propose a new SMC-within-SMC method. In the first step, each individual-level parameter is estimated separately using standard SMC and a non-hierarchical auxiliary prior. In the second step, we use weighting methods to replace the auxiliary prior with the hierarchical one without recomputing any likelihood. In addition to allowing individual data to be processed separately, this approach drastically reduces the computational cost. A MATLAB package is provided.
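The second step can be illustrated with a small importance-sampling sketch. The Python snippet below is an assumption-laden illustration, not the paper's MATLAB package: the function name reweight_to_hierarchical and the toy normal priors are hypothetical. It shows the general idea of retargeting particles drawn under an auxiliary prior to a hierarchical prior by multiplying each weight by the prior ratio, with no likelihood evaluations.

```python
import numpy as np

def reweight_to_hierarchical(particles, weights,
                             log_aux_prior, log_hier_prior, hyper):
    """Swap the auxiliary prior for the hierarchical prior by importance
    reweighting; no likelihood evaluations are needed because the
    likelihood term cancels in the weight ratio."""
    # Log importance ratio: hierarchical prior over auxiliary prior.
    # Normalization constants are shared across particles and cancel
    # when the weights are renormalized below.
    log_ratio = log_hier_prior(particles, hyper) - log_aux_prior(particles)
    log_w = np.log(weights) + log_ratio
    log_w -= log_w.max()            # stabilize before exponentiating
    new_w = np.exp(log_w)
    return new_w / new_w.sum()      # normalized weights

# Toy usage (hypothetical priors): particles from step 1 were drawn
# under a N(0, 2^2) auxiliary prior; the hierarchical prior is N(mu, 1).
rng = np.random.default_rng(0)
particles = rng.normal(0.0, 2.0, size=1000)   # individual-level draws
weights = np.full(1000, 1.0 / 1000)           # equal weights after resampling

log_aux = lambda th: -0.5 * (th / 2.0) ** 2
log_hier = lambda th, mu: -0.5 * (th - mu) ** 2
new_weights = reweight_to_hierarchical(particles, weights,
                                       log_aux, log_hier, hyper=0.5)
```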

Keywords: Sequential Monte Carlo, Sample Degeneracy, Sample Impoverishment, Hierarchical Bayes, Big Data

Suggested Citation

Daviet, Remi, Sequential Monte Carlo for Hierarchical Bayes with Large Datasets (May 5, 2019). Available at SSRN: https://ssrn.com/abstract=3382624 or http://dx.doi.org/10.2139/ssrn.3382624

Remi Daviet (Contact Author)

Wisconsin School of Business

975 University Avenue
Madison, WI 53706
United States
