On the Sparsity of Mallows’ Model Averaging Estimator

12 Pages. Posted: 25 Jul 2019. Last revised: 29 Jul 2019.

Yang Feng

New York University (NYU) - School of Global Public Health

Qingfeng Liu

Otaru University of Commerce - Department of Economics

Ryo Okui

Seoul National University

Date Written: July 23, 2019

Abstract

We show that the Mallows' model averaging estimator proposed by Hansen (2007) can be written as a least squares estimator with a weighted L1 penalty and additional constraints. Exploiting this representation, we demonstrate that the weight vector obtained by this model averaging procedure is sparse in the sense that a subset of models receives exactly zero weight. Moreover, the representation allows us to adapt algorithms developed to efficiently solve minimization problems with many parameters and a weighted L1 penalty. In particular, we develop a new coordinate-wise descent algorithm for model averaging. Simulation studies show that the new algorithm computes the model averaging estimator much faster, and requires less memory, than conventional methods when the number of candidate models is large.
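
As a rough illustration of the criterion underlying the procedure, the following minimal Python sketch (not the authors' implementation) computes Mallows' model averaging weights by direct constrained minimization of the Mallows criterion of Hansen (2007): C(w) = ||y - sum_m w_m yhat_m||^2 + 2*sigma2*sum_m w_m k_m, subject to w_m >= 0 and sum_m w_m = 1. The function name mma_weights and the use of scipy's SLSQP solver are illustrative choices, not part of the paper; the paper's own algorithm is a coordinate-wise descent exploiting the weighted L1 representation.

```python
# Minimal sketch of Mallows' model averaging weight selection (Hansen, 2007).
# Not the authors' coordinate-wise descent algorithm; a generic constrained
# optimizer is used here purely for illustration.
import numpy as np
from scipy.optimize import minimize

def mma_weights(y, fitted, k, sigma2):
    """y: (n,) response; fitted: (n, M) fitted values of M candidate models;
    k: (M,) number of regressors in each model; sigma2: error-variance estimate."""
    M = fitted.shape[1]

    def criterion(w):
        # Mallows criterion: residual sum of squares + 2 * sigma2 * effective model size
        resid = y - fitted @ w
        return resid @ resid + 2.0 * sigma2 * (w @ k)

    # Weights lie on the simplex: nonnegative and summing to one.
    constraints = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * M
    w0 = np.full(M, 1.0 / M)
    res = minimize(criterion, w0, bounds=bounds,
                   constraints=constraints, method="SLSQP")
    return res.x
```

Consistent with the sparsity property described above, the weight vector returned by such a program typically places exactly zero weight on a subset of the candidate models.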

Keywords: sparsity, model averaging, $L_1$ penalty, coordinate-wise descent algorithm

JEL Classification: C51, C52

Suggested Citation

Feng, Yang and Liu, Qingfeng and Okui, Ryo, On the Sparsity of Mallows’ Model Averaging Estimator (July 23, 2019). Available at SSRN: https://ssrn.com/abstract=3425424 or http://dx.doi.org/10.2139/ssrn.3425424

Yang Feng

New York University (NYU) - School of Global Public Health

United States

Qingfeng Liu (Contact Author)

Otaru University of Commerce - Department of Economics

3-5-21 Midori
Otaru City, Hokkaido 047-8501
Japan

Ryo Okui

Seoul National University

Seoul
Korea, Republic of (South Korea)
