Forward Regression for Ultra-High Dimensional Variable Screening
33 Pages Posted: 10 Apr 2009
Date Written: April 9, 2009
Motivated by the seminal theory of Sure Independence Screening (Fan and Lv, 2008, SIS), we investigate here another popular and classical variable screening method, namely, Forward Regression (FR). Our theoretical analysis reveals that FR can identify all relevant predictors consistently, even if the predictor dimension is substantially larger than the sample size. In particular, if the dimension of the true model is finite, FR can discover all relevant predictors within a finite number of steps. To practically select the "best" candidate from the models generated by FR, the recently proposed BIC criterion of Chen and Chen (2008) can be used. The resulting model can then serve as an excellent starting point, from which many existing variable selection methods (e.g., SCAD and Adaptive LASSO) can be applied directly. FR's outstanding finite-sample performance is confirmed by extensive numerical studies.
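The screening-then-selection procedure the abstract describes can be sketched as follows. This is an illustrative implementation only, not the authors' code: forward regression greedily grows a nested sequence of candidate models, and the extended BIC of Chen and Chen (2008) picks the "best" one. The function names, the toy data, and the choice of tuning parameter `gamma = 0.5` are assumptions made for the example.

```python
import numpy as np
from math import lgamma, log

def forward_regression(X, y, max_steps):
    """Greedy forward selection: at each step, add the predictor that
    most reduces the residual sum of squares of the fitted model."""
    n, p = X.shape
    active, path = [], []
    for _ in range(min(max_steps, p)):
        best_j, best_rss = None, np.inf
        for j in range(p):
            if j in active:
                continue
            Xs = X[:, active + [j]]
            beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            resid = y - Xs @ beta
            rss = float(resid @ resid)
            if rss < best_rss:
                best_j, best_rss = j, rss
        active.append(best_j)
        path.append(list(active))  # nested sequence of candidate models
    return path

def ebic(X, y, model, gamma=0.5):
    """Extended BIC of Chen and Chen (2008):
    n*log(RSS/n) + k*log(n) + 2*gamma*log(choose(p, k))."""
    n, p = X.shape
    Xs = X[:, model]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    rss = float(resid @ resid)
    k = len(model)
    log_binom = lgamma(p + 1) - lgamma(k + 1) - lgamma(p - k + 1)
    return n * log(rss / n) + k * log(n) + 2 * gamma * log_binom

# Toy ultra-high dimensional setting: p = 200 predictors, n = 100
# observations, and only the first three predictors are relevant.
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
y = 5 * X[:, 0] + 4 * X[:, 1] + 3 * X[:, 2] + rng.standard_normal(n)

path = forward_regression(X, y, max_steps=10)
best = min(path, key=lambda m: ebic(X, y, m))  # EBIC-selected model
```

The selected set `best` would then serve as the starting point for a refinement method such as SCAD or the adaptive LASSO, as the abstract suggests.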
Keywords: Adaptive Lasso, BIC, Forward Regression, LASSO, SCAD, Screening Consistency
JEL Classification: C10, C13