Bayesian Nonparametric Sparse Seemingly Unrelated Regression Model (SUR)
38 Pages. Posted: 1 Sep 2016. Last revised: 31 Jul 2017
Date Written: July 31, 2017
Abstract
Seemingly unrelated regression (SUR) models are useful in studying the interactions among different variables. In a high-dimensional setting, or when applied to large panels of time series, these models require a large number of parameters to be estimated and suffer from inferential problems.
To avoid overparametrization and overfitting issues, we propose a hierarchical Dirichlet process prior for SUR models, which allows shrinkage of SUR coefficients toward multiple locations and identification of groups of coefficients. We propose a two-stage hierarchical prior distribution, where the first stage of the hierarchy consists of a conditionally independent lasso-type prior of the Normal-Gamma family for the SUR coefficients. The second stage is a random mixture distribution for the Normal-Gamma hyperparameters, which allows for parameter parsimony through two components: the first is a random Dirac point-mass distribution, which induces sparsity in the SUR coefficients; the second is a Dirichlet process prior, which allows for clustering of the SUR coefficients.
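A schematic sketch of this two-stage prior, in generic notation (the symbols beta_j, mu_j, gamma_j, nu_j, tau_j, pi, alpha and the base measure G_0 are illustrative assumptions and need not match the paper's exact parameterisation):

% Illustrative sketch only; notation is assumed, not taken from the paper.
% First stage: conditionally independent Normal-Gamma (lasso-type) prior
% on each SUR coefficient beta_j, with location mu_j and scale gamma_j.
% Second stage: the Normal-Gamma hyperparameters theta_j are drawn from a
% mixture of a Dirac point mass at a sparsity-inducing value theta_0 and a
% Dirichlet process, which clusters the non-sparse coefficients.
\begin{align*}
\beta_j \mid \mu_j, \gamma_j &\sim \mathcal{N}(\mu_j, \gamma_j),
\qquad \gamma_j \mid \nu_j, \tau_j \sim \mathrm{Ga}(\nu_j, \tau_j), \\
\theta_j = (\mu_j, \nu_j, \tau_j) &\sim \pi\, \delta_{\theta_0} + (1 - \pi)\, G,
\qquad G \sim \mathrm{DP}(\alpha, G_0).
\end{align*}

Under this reading, draws of theta_j from the point mass shrink the corresponding coefficients to the sparsity value, while draws from the Dirichlet process component group the remaining coefficients around shared locations, scales and shapes.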
Our sparse SUR model with multiple locations, scales and shapes includes vector autoregressive (VAR) models and dynamic panel models as special cases. We consider an international business cycle application to show the effectiveness of our model and inference approach. Our new multiple-shrinkage prior allows us to better understand shock transmission phenomena, to extract coloured networks and to classify the strength of the linkages. The empirical results offer a different point of view on international business cycles, providing interesting new findings on the relationship between core and periphery countries.
Keywords: Bayesian nonparametrics; Bayesian model selection; Shrinkage; Large vector autoregression; Network representation; Connectedness
JEL Classification: C11, C13, C14, C32, C51, E17