On the Computational Complexity of MCMC-Based Estimators in Large Samples
38 Pages. Posted: 1 Mar 2007
Date Written: February 23, 2007
Abstract
This paper studies the computational complexity of Bayesian and quasi-Bayesian estimation in large samples carried out using a basic Metropolis random walk. The framework covers cases where the underlying likelihood or extremum criterion function is possibly non-concave, discontinuous, and of increasing dimension. Using a central limit framework to provide structural restrictions for the problem, the algorithm is shown to be computationally efficient: its running time in large samples is bounded in probability by a polynomial in the parameter dimension d, and is of stochastic order d^2 in the leading cases after the burn-in period. The reason is that, in large samples, a central limit theorem implies that the posterior or quasi-posterior approaches a normal density, which restricts the deviations from continuity and concavity in a specific manner, so that the computational complexity is polynomial. An application to exponential and curved exponential families of increasing dimension is given.
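To make the algorithm concrete, the following is a minimal sketch of a basic random-walk Metropolis sampler of the kind the abstract refers to. The log-density `log_target`, the standard-normal example target (mimicking the large-sample normal limit of the quasi-posterior), and the step-size heuristic `2.4 / sqrt(d)` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def random_walk_metropolis(log_target, x0, n_steps, scale, rng=None):
    """Sample from a (quasi-)posterior given its log density up to a constant."""
    if rng is None:
        rng = np.random.default_rng(0)
    d = x0.shape[0]
    x = x0.copy()
    log_p = log_target(x)
    chain = np.empty((n_steps, d))
    for t in range(n_steps):
        # Propose a symmetric Gaussian random-walk step.
        proposal = x + scale * rng.standard_normal(d)
        log_p_new = log_target(proposal)
        # Accept with probability min(1, pi(proposal) / pi(x)).
        if np.log(rng.uniform()) < log_p_new - log_p:
            x, log_p = proposal, log_p_new
        chain[t] = x
    return chain

# Hypothetical example: a d-dimensional standard normal target,
# standing in for the approximately normal large-sample posterior.
d = 10
chain = random_walk_metropolis(
    log_target=lambda x: -0.5 * x @ x,
    x0=np.zeros(d),
    n_steps=5000,
    scale=2.4 / np.sqrt(d),  # common scaling heuristic for RW Metropolis
)
print(chain[1000:].mean(axis=0))  # posterior-mean estimate after burn-in
```

The proposal scale shrinking like 1/sqrt(d) is a standard choice for random-walk Metropolis on approximately Gaussian targets; it keeps the acceptance rate stable as the dimension grows, consistent with the polynomial-in-d running-time behavior the paper establishes.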
Keywords: Computational Complexity, Metropolis, Large Samples, Sampling, Integration, Exponential family, Moment restrictions
JEL Classification: C1, C11, C15, C6, C63