On the Sparsity of Mallows’ Model Averaging Estimator

12 Pages · Posted: 25 Jul 2019 · Last revised: 29 Jul 2019

Yang Feng

New York University (NYU) - School of Global Public Health

Qingfeng Liu

Hosei University - Department of Industrial and Systems Engineering

Ryo Okui

University of Tokyo - Graduate School of Economics

Date Written: July 23, 2019

Abstract

We show that Mallows' model averaging estimator proposed by Hansen (2007) can be written as the solution to a least squares problem with a weighted $L_1$ penalty and additional constraints. Exploiting this representation, we demonstrate that the weight vector obtained by this model averaging procedure is sparse in the sense that a subset of the candidate models receives exactly zero weight. Moreover, the representation allows us to adapt algorithms developed to efficiently solve minimization problems with many parameters and a weighted $L_1$ penalty. In particular, we develop a new coordinate-wise descent algorithm for model averaging. Simulation studies show that the new algorithm computes the model averaging estimator much faster and requires less memory than conventional methods when the number of candidate models is large.
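To make the representation concrete, the sketch below sets up Hansen's (2007) Mallows criterion $\|y - Fw\|^2 + 2\hat\sigma^2 \sum_m k_m w_m$ over the weight simplex, where the columns of $F$ are the fitted values of the candidate models and $k_m$ is the number of parameters in model $m$; because the weights are nonnegative, the penalty equals the weighted $L_1$ penalty $2\hat\sigma^2 \sum_m k_m |w_m|$. The code is only an illustration under an assumed nested-model design: it solves the constrained problem with a generic SLSQP solver rather than the paper's coordinate-wise descent algorithm, and simply reports how many weights come out (numerically) zero.

# Illustrative sketch (not the authors' implementation): Mallows' model
# averaging (Hansen 2007) written as penalized least squares over the weight
# simplex. With w_m >= 0, the Mallows penalty 2*sigma2*sum_m k_m*w_m equals
# the weighted L1 penalty 2*sigma2*sum_m k_m*|w_m|, the representation the
# paper exploits. A generic SLSQP solver is used here purely to show the
# sparsity of the resulting weights; the paper's coordinate-wise descent
# algorithm is a faster special-purpose solver for the same problem.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 200, 12                            # sample size, number of regressors
X = rng.normal(size=(n, p))
beta = 1.0 / (np.arange(1, p + 1) ** 2)   # decaying coefficients (assumed DGP)
y = X @ beta + rng.normal(size=n)

# Candidate models: nested, model m uses the first m regressors.
M = p
fits = np.empty((n, M))                   # column m-1: fitted values of model m
k = np.arange(1, M + 1)                   # number of parameters in each model
for m in range(1, M + 1):
    Xm = X[:, :m]
    fits[:, m - 1] = Xm @ np.linalg.lstsq(Xm, y, rcond=None)[0]

# sigma^2 estimated from the largest candidate model, as in Hansen (2007).
resid_full = y - fits[:, -1]
sigma2 = resid_full @ resid_full / (n - M)

def mallows(w):
    # ||y - F w||^2 + 2*sigma2*sum_m k_m*|w_m| (= Mallows criterion when w >= 0)
    r = y - fits @ w
    return r @ r + 2.0 * sigma2 * (k * np.abs(w)).sum()

cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)   # weights sum to one
bounds = [(0.0, 1.0)] * M                                  # nonnegative weights
w0 = np.full(M, 1.0 / M)
res = minimize(mallows, w0, method='SLSQP', bounds=bounds, constraints=cons)

w_hat = np.where(res.x < 1e-8, 0.0, res.x)                 # zero out solver noise
print("MMA weights:", np.round(w_hat, 3))
print("models with zero weight:", int((w_hat == 0).sum()), "out of", M)

In this nested-model example most candidate models typically receive zero weight, which is the sparsity property the paper establishes; the weighted $L_1$ representation is what makes Lasso-style coordinate-wise algorithms applicable.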

Keywords: sparsity, model averaging, $L_1$ penalty, coordinate-wise descent algorithm

JEL Classification: C51, C52

Suggested Citation

Feng, Yang and Liu, Qingfeng and Okui, Ryo, On the Sparsity of Mallows’ Model Averaging Estimator (July 23, 2019). Available at SSRN: https://ssrn.com/abstract=3425424 or http://dx.doi.org/10.2139/ssrn.3425424

Yang Feng

New York University (NYU) - School of Global Public Health

United States

Qingfeng Liu (Contact Author)

Hosei University - Department of Industrial and Systems Engineering

Kajinocho 3-7-2
Koganei, Tokyo 184-8584
Japan

Ryo Okui

University of Tokyo - Graduate School of Economics

Tokyo
Japan
