LASSO Regularization within the LocalGLMnet Architecture

29 Pages. Posted: 24 Sep 2021. Last revised: 1 Jun 2022.

Ronald Richman

Old Mutual Insure; University of the Witwatersrand

Mario V. Wuthrich

RiskLab, ETH Zurich

Date Written: September 20, 2021

Abstract

Deep learning models have been very successful in machine learning applications, often outperforming classical statistical models such as linear regression models or generalized linear models. On the other hand, deep learning models are often criticized for not being explainable and for not allowing for variable selection. There are two ways of dealing with this problem: either we use post-hoc model interpretability methods, or we design specific deep learning architectures that allow for easier interpretation and explanation.

This paper builds on our previous work on the LocalGLMnet, an interpretable deep learning architecture. In the present paper, we show how group LASSO regularization (and other regularization schemes) can be implemented within the LocalGLMnet architecture so that we obtain feature sparsity for variable selection. We benchmark our approach against the recently developed LassoNet of Lemhadri et al.
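
As an illustration of the idea described in the abstract, the following Python/Keras sketch builds a LocalGLMnet-style network and adds a group LASSO penalty on the regression attentions beta(x), with one group per input feature. This is a minimal sketch under stated assumptions, not the authors' implementation: the layer sizes, the Poisson example with log-link, the penalty strength lam, and the exact per-batch form of the penalty are illustrative choices only.

# Minimal sketch (not the authors' code) of a LocalGLMnet-style network with a
# group LASSO penalty on the regression attentions beta(x). Layer sizes, the
# Poisson/log-link example and the penalty strength `lam` are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow import keras

n_features = 8
lam = 0.01  # assumed group LASSO regularization strength


def group_lasso(beta):
    # one group per input feature: lam * sum_j sqrt(mean_i beta_j(x_i)^2),
    # computed as a per-batch approximation of the penalty
    return lam * tf.reduce_sum(tf.sqrt(tf.reduce_mean(tf.square(beta), axis=0) + 1e-12))


inputs = keras.Input(shape=(n_features,))
h = keras.layers.Dense(32, activation="tanh")(inputs)
h = keras.layers.Dense(16, activation="tanh")(h)
# regression attentions beta(x), one component per input feature,
# penalized through the group LASSO activity regularizer
beta = keras.layers.Dense(n_features, activation="linear",
                          activity_regularizer=group_lasso,
                          name="attention")(h)
# LocalGLMnet skip connection: scalar product <beta(x), x>
linear = keras.layers.Dot(axes=1)([beta, inputs])
# intercept and inverse log-link, giving a GLM-like response
out = keras.layers.Dense(1, activation="exponential")(linear)

model = keras.Model(inputs, out)
model.compile(optimizer="adam", loss="poisson")

# toy data, only to demonstrate the training call
X = np.random.rand(256, n_features).astype("float32")
y = np.random.poisson(1.0, size=(256, 1)).astype("float32")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)

After training, features whose attention components beta_j(x) are shrunk towards zero across the data can be dropped, which is how the regularization is meant to deliver feature sparsity and hence variable selection.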

Keywords: Deep learning, neural networks, LocalGLMnet, regression model, variable selection, regularization, LASSO, group LASSO, ridge regularization, Tikhonov regularization.

JEL Classification: C14

Suggested Citation

Richman, Ronald and Wuthrich, Mario V., LASSO Regularization within the LocalGLMnet Architecture (September 20, 2021). Available at SSRN: https://ssrn.com/abstract=3927187 or http://dx.doi.org/10.2139/ssrn.3927187

Ronald Richman (Contact Author)

Old Mutual Insure ( email )

Wanooka Place
St Andrews Road
Johannesburg, 2192
South Africa

University of the Witwatersrand ( email )

Mario V. Wuthrich

RiskLab, ETH Zurich ( email )

Department of Mathematics
Rämistrasse 101
Zurich, 8092
Switzerland

Paper statistics

Downloads: 437
Abstract Views: 1,457
Rank: 138,526