L1-Penalized Quantile Regression in High Dimensional Sparse Models
61 Pages, Posted: 30 Apr 2009
Date Written: April 19, 2009
Abstract
We consider median regression and, more generally, quantile regression in high-dimensional sparse models. In these models the overall number of regressors p is very large, possibly larger than the sample size n, but only s of these regressors have a non-zero impact on the conditional quantile of the response variable, where s grows more slowly than n. Since ordinary quantile regression is not consistent in this case, we consider quantile regression penalized by the L1-norm of the coefficients (L1-QR). First, we show that L1-QR is consistent, up to a logarithmic factor, at the oracle rate that would be achievable if the minimal true model were known. The overall number of regressors p affects the rate only through a logarithmic factor, thus allowing nearly exponential growth in the number of zero-impact regressors. The rate result holds under relatively weak conditions, requiring that s/n converge to zero at a super-logarithmic speed and that the regularization parameter satisfy certain theoretical constraints. Second, we propose a pivotal, data-driven choice of the regularization parameter and show that it satisfies these theoretical constraints. Third, we show that L1-QR correctly selects the true minimal model as a valid submodel when the non-zero coefficients of the true model are well separated from zero. We also show that the number of non-zero coefficients in L1-QR is of the same stochastic order as s, the number of non-zero coefficients in the minimal true model. Fourth, we analyze the rate of convergence of a two-step estimator that applies ordinary quantile regression to the selected model. Fifth, we evaluate the performance of L1-QR in a Monte Carlo experiment and provide an application to the analysis of international economic growth.
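As a concrete, hedged illustration of the estimator described above: L1-QR minimizes (1/n) sum_i rho_tau(y_i - x_i'beta) + lambda * ||beta||_1 over beta, where rho_tau(u) = u(tau - 1{u < 0}) is the check function (the paper's exact penalty may include additional quantile- and design-dependent scaling). The Python sketch below fits this objective with scikit-learn's QuantileRegressor on simulated sparse data. The simulate_penalty helper and its constants (n_sim, quantile, c) are illustrative assumptions, loosely in the spirit of the pivotal simulation-based penalty rule mentioned in the abstract, not the authors' exact proposal; the final lines refit ordinary quantile regression on the selected regressors, mirroring the two-step estimator discussed above.

# Illustrative sketch of L1-penalized median regression (L1-QR).
# scikit-learn's QuantileRegressor minimizes the average check (pinball)
# loss plus alpha * ||coef||_1; data and tuning choices below are assumptions.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
n, p, s, tau = 100, 200, 5, 0.5            # high-dimensional design: p > n

X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 1.0                         # only s regressors have non-zero impact
y = X @ beta_true + rng.standard_normal(n)

def simulate_penalty(X, tau, n_sim=500, quantile=0.9, c=1.0):
    # Simulate the score max_j |(1/n) sum_i x_ij (tau - 1{u_i <= tau})|
    # with u_i ~ Uniform(0, 1); its distribution does not depend on unknown
    # parameters, so an upper quantile gives a data-driven penalty level.
    # (Illustrative rule only; constants are not the paper's recommendation.)
    n = X.shape[0]
    scores = np.empty(n_sim)
    for b in range(n_sim):
        u = rng.uniform(size=n)
        w = tau - (u <= tau)
        scores[b] = np.max(np.abs(X.T @ w) / n)
    return c * np.quantile(scores, quantile)

alpha = simulate_penalty(X, tau)

# Step 1: L1-QR fit and model selection via the non-zero coefficients.
l1qr = QuantileRegressor(quantile=tau, alpha=alpha).fit(X, y)
selected = np.flatnonzero(np.abs(l1qr.coef_) > 1e-8)

# Step 2: post-selection refit by ordinary (unpenalized) quantile regression
# on the selected regressors only.
post = QuantileRegressor(quantile=tau, alpha=0.0).fit(X[:, selected], y)

print("penalty level:", alpha)
print("selected regressors:", selected)
print("post-L1-QR coefficients:", post.coef_)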
Keywords: median regression, quantile regression, sparse models
JEL Classification: C13, C14, C30, C51, D4, J24, J31