Sequential Monte Carlo for Hierarchical Bayes with Large Datasets

17 Pages Posted: 30 May 2019


Remi Daviet

University of Pennsylvania - Marketing Department

Date Written: May 5, 2019

Abstract

Practical use of Hierarchical Bayes models requires efficient methods for posterior inference. Sequential Monte Carlo (SMC) methods have emerged as an extremely robust way to simulate complicated Bayesian posteriors. The simplest version successively weights and resamples draws, called particles, from a sequence of target distributions. The method has one main weakness that keeps it from being used with complex hierarchical models and large data sets: sample impoverishment. This issue is usually alleviated through a "refreshing" MCMC step. However, to preserve effectiveness, the computational cost of this step grows quadratically with the total number of observations. We propose a new SMC-within-SMC method. In a first step, each individual-level parameter is estimated separately using standard SMC and a non-hierarchical auxiliary prior. In a second step, we use weighting methods to replace the auxiliary prior with the hierarchical one without recomputing any likelihood. In addition to allowing individual data to be processed separately, this approach drastically reduces the computational cost. A MATLAB package is provided.
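The prior-replacement step described above can be sketched in a few lines: because the auxiliary prior cancels out of the importance ratio, swapping it for the hierarchical prior only requires evaluating the two priors at each particle, never the likelihood. The toy setup below (a single individual-level parameter, a wide normal auxiliary prior, and hypothetical hierarchical hyperparameters) is an illustrative stand-in, not the paper's MATLAB implementation:

```python
import math
import random

random.seed(0)

def log_normal_pdf(x, mu, sigma):
    # Log density of N(mu, sigma^2) at x.
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

# Step 1 (stand-in): suppose a standard per-individual SMC run under the
# auxiliary prior N(0, 2^2) has finished and, after resampling, left an
# equally weighted particle cloud. With no data in this toy example, the
# particles are simply draws from the auxiliary prior.
mu_aux, sigma_aux = 0.0, 2.0
particles = [random.gauss(mu_aux, sigma_aux) for _ in range(5000)]
log_w = [0.0] * len(particles)  # equal log-weights after resampling

# Step 2: replace the auxiliary prior with the hierarchical prior
# pi_hier(theta | eta) by reweighting each particle:
#   w_new = w_old * pi_hier(theta | eta) / pi_aux(theta)
# No likelihood term appears, so no individual data are touched again.
mu_eta, sigma_eta = 1.0, 0.5  # hypothetical hierarchical hyperparameters
new_log_w = [lw + log_normal_pdf(th, mu_eta, sigma_eta)
                - log_normal_pdf(th, mu_aux, sigma_aux)
             for lw, th in zip(log_w, particles)]

# Normalize (with the usual max-subtraction for numerical stability) and
# compute the reweighted posterior mean of theta.
m = max(new_log_w)
w = [math.exp(lw - m) for lw in new_log_w]
total = sum(w)
post_mean = sum(wi * th for wi, th in zip(w, particles)) / total
```

In this likelihood-free toy case the reweighted cloud should simply target the hierarchical prior, so `post_mean` lands near `mu_eta`; in the actual method the same ratio multiplies weights that already encode the individual-level likelihood.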

Keywords: Sequential Monte Carlo, Sample Degeneracy, Sample Impoverishment, Hierarchical Bayes, Big Data

Suggested Citation

Daviet, Remi, Sequential Monte Carlo for Hierarchical Bayes with Large Datasets (May 5, 2019). Available at SSRN: https://ssrn.com/abstract=3382624 or http://dx.doi.org/10.2139/ssrn.3382624

Remi Daviet (Contact Author)

University of Pennsylvania - Marketing Department

700 Jon M. Huntsman Hall
3730 Walnut Street
Philadelphia, PA 19104-6340
United States
