Sequential Monte Carlo for Hierarchical Bayes with Large Datasets

17 Pages Posted: 30 May 2019

Remi Daviet

Wharton Marketing Department, University of Pennsylvania

Date Written: May 5, 2019

Abstract

Practical use of hierarchical Bayes models requires efficient methods for posterior inference. Sequential Monte Carlo (SMC) methods have emerged as an extremely robust way to simulate complicated Bayesian posteriors. The simplest version successively weights and resamples draws, called particles, from a sequence of target distributions. The method has one main weakness that keeps it from being used with complex hierarchical models and large datasets: sample impoverishment. This issue is usually alleviated with a "refreshing" MCMC step; however, to preserve effectiveness, the computational cost of this step grows quadratically with the total number of observations. We propose a new SMC-within-SMC method. In a first step, each individual-level parameter is estimated separately using standard SMC and a non-hierarchical auxiliary prior. In a second step, we use weighting methods to replace the auxiliary prior with the hierarchical one without recomputing any likelihood. In addition to allowing individual-level data to be processed separately, this approach drastically reduces the computational cost. A MATLAB package is provided.
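The prior-replacement step described in the abstract can be illustrated with importance reweighting: because the likelihood appears in both the auxiliary-prior posterior and the hierarchical-prior posterior, it cancels in the weight ratio, leaving only a ratio of prior densities. The following is a minimal sketch of that idea, assuming for illustration a Gaussian auxiliary prior and a Gaussian hierarchical prior for one individual-level parameter (all distributions and parameter values here are hypothetical, not from the paper):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# First stage (hypothetical): equally weighted particles for one
# individual's parameter, obtained from a standard SMC run under a
# broad auxiliary prior N(0, 5^2).
n = 1000
particles = rng.normal(0.0, 5.0, size=n)
weights = np.full(n, 1.0 / n)  # equal weights after resampling

# Second stage: swap the auxiliary prior for the hierarchical prior
# N(mu, sigma^2). The likelihood terms cancel in the ratio of target
# densities, so only prior densities are evaluated -- no likelihood
# is recomputed.
mu, sigma = 1.0, 2.0
log_ratio = (norm.logpdf(particles, mu, sigma)
             - norm.logpdf(particles, 0.0, 5.0))
new_weights = weights * np.exp(log_ratio)
new_weights /= new_weights.sum()

# Weighted averages over the reweighted particles now approximate
# expectations under the hierarchical-prior target.
post_mean = np.sum(new_weights * particles)
```

In this toy example there is no likelihood at all, so the reweighted particles simply approximate the hierarchical prior itself; in the actual method the same cancellation lets each individual's first-stage particle set be reused unchanged.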

Keywords: Sequential Monte Carlo, Sample Degeneracy, Sample Impoverishment, Hierarchical Bayes, Big Data

Suggested Citation

Daviet, Remi, Sequential Monte Carlo for Hierarchical Bayes with Large Datasets (May 5, 2019). Available at SSRN: https://ssrn.com/abstract=3382624 or http://dx.doi.org/10.2139/ssrn.3382624

Remi Daviet (Contact Author)

Wharton Marketing Department, University of Pennsylvania ( email )

406 North 42nd Street
3730 Walnut Street
Philadelphia, PA 19104
United States
2675066499 (Phone)
