Towards Explaining the ReLU Feed-Forward Network

56 Pages · Posted: 27 Dec 2019 · Last revised: 28 Dec 2019

Hasan Fallahgoul

Monash University

Vincentius Franstianto

Monash University

Gregoire Loeper

Monash University - School of Mathematical Sciences; Ecole Centrale Paris

Date Written: December 6, 2019

Abstract

A multi-layer, multi-node ReLU network is a powerful, efficient, and popular tool for statistical prediction tasks. However, in contrast to the great emphasis on its empirical applications, its statistical properties have rarely been investigated. To help close this gap, we establish three asymptotic properties of the ReLU network: consistency, a sieve-based convergence rate, and asymptotic normality. A Monte Carlo analysis validates the theoretical results.
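The estimator discussed in the abstract is a multi-layer, multi-node ReLU feed-forward network fitted to data. As a point of reference for readers unfamiliar with the architecture, the sketch below fits such a network to a noisy univariate regression by plain gradient descent. The layer widths, target function, learning rate, and sample size are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (assumed setup, not the paper's code): a multi-layer,
# multi-node ReLU feed-forward network trained by full-batch gradient
# descent on synthetic regression data.
import numpy as np

rng = np.random.default_rng(0)

def init_params(widths):
    """He-style initialisation for a stack of dense ReLU layers."""
    params = []
    for n_in, n_out in zip(widths[:-1], widths[1:]):
        W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
        b = np.zeros(n_out)
        params.append((W, b))
    return params

def forward(params, X):
    """Forward pass: ReLU on hidden layers, linear output layer."""
    h, caches = X, []
    for i, (W, b) in enumerate(params):
        z = h @ W + b
        caches.append((h, z))
        h = z if i == len(params) - 1 else np.maximum(z, 0.0)
    return h, caches

def backward(params, caches, grad_out):
    """Backpropagate the squared-error gradient through the network."""
    grads = [None] * len(params)
    g = grad_out
    for i in reversed(range(len(params))):
        h_in, z = caches[i]
        if i != len(params) - 1:
            g = g * (z > 0.0)          # derivative of ReLU
        grads[i] = (h_in.T @ g, g.sum(axis=0))
        g = g @ params[i][0].T
    return grads

# Toy data-generating process (illustrative): y = sin(2*pi*x) + noise.
n = 2000
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(2.0 * np.pi * X) + 0.1 * rng.normal(size=(n, 1))

params = init_params([1, 32, 32, 1])   # two hidden ReLU layers (assumed widths)
lr = 1e-2
for step in range(5000):
    pred, caches = forward(params, X)
    grad_out = 2.0 * (pred - y) / n    # gradient of mean squared error
    grads = backward(params, caches, grad_out)
    params = [(W - lr * dW, b - lr * db)
              for (W, b), (dW, db) in zip(params, grads)]

mse = float(np.mean((forward(params, X)[0] - y) ** 2))
print(f"in-sample MSE after training: {mse:.4f}")
```

In the paper's sieve framework, networks of this form indexed by growing width play the role of the sieve spaces; the sketch above only illustrates the estimator being studied, not the asymptotic analysis itself.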

Keywords: Consistency, Rate of Convergence, Sieve Estimators, Rectified Linear Unit

JEL Classification: G1, C1

Suggested Citation

Fallahgoul, Hasan A and Franstianto, Vincentius and Loeper, Gregoire, Towards Explaining the ReLU Feed-Forward Network (December 6, 2019). Available at SSRN: https://ssrn.com/abstract=3499324 or http://dx.doi.org/10.2139/ssrn.3499324

Hasan A Fallahgoul (Contact Author)

Monash University ( email )

Clayton Campus
Victoria, 3800
Australia

HOME PAGE: http://www.hfallahgoul.com

Vincentius Franstianto

Monash University ( email )

23 Innovation Walk
Wellington Road
Clayton, Victoria 3800
Australia

Gregoire Loeper

Monash University - School of Mathematical Sciences ( email )

Clayton Campus
Victoria, 3800
Australia

Ecole Centrale Paris ( email )

Paris
France
