Deep Learning Diagnostics ‒ How to Avoid Being Fooled by TensorFlow, PyTorch, or MXNet with the Help of Modern Econometrics
Schriftenreihe des Instituts für Empirie & Statistik der FOM Hochschule, Band 24 (2021)
60 Pages
Posted: 1 Apr 2021
Date Written: March 12, 2021
Abstract
Training a Multi-Layer Perceptron (MLP) to minimize the MSE is akin to performing Non-Linear Regression (NLR). We therefore draw on established econometric theory and the corresponding tools in R. Only if certain assumptions about the error term of the Data Generating Process hold can the trained MLP be regarded as a consistent estimator. Verifying these assumptions requires careful diagnostics.
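As a minimal, illustrative R sketch (not taken from the paper), fitting an NLR model with nls() and checking the residuals shows the kind of diagnostics meant here; the exponential DGP, parameter values, and checks are assumptions chosen for illustration only.

# Sketch: NLR via nls() plus basic residual diagnostics (illustrative DGP, not from the paper)
set.seed(1)
n <- 500
x <- runif(n, 0, 2)
y <- 1.5 * exp(0.8 * x) + rnorm(n, sd = 0.3)   # assumed DGP: y = a*exp(b*x) + iid Gaussian error

fit <- nls(y ~ a * exp(b * x), start = list(a = 1, b = 1))
summary(fit)

# Diagnostics: residuals should scatter like white noise around zero
res <- residuals(fit)
plot(fitted(fit), res); abline(h = 0, lty = 2)
shapiro.test(res)   # rough check of the normality assumption on the error term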
Using controlled experiments, we show that even in an ideal setting an MLP may fail to learn a relationship, whereas NLR performs better. We also illustrate how the MLP is outperformed by Non-Linear Quantile Regression in the presence of outliers. A third situation in which the MLP is led astray is one where no relationship exists at all, yet the MLP still "learns" one, producing high values of R². We show that this trap of spurious learning can only be avoided with the help of diagnostics.
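A hedged R sketch of the outlier comparison (again illustrative, not the paper's code): median non-linear quantile regression via quantreg::nlrq versus least-squares nls() on a contaminated sample; the DGP, outlier mechanism, and starting values are assumptions for demonstration.

# Sketch: NLR vs. Non-Linear Quantile Regression under outliers (illustrative setup)
library(quantreg)
set.seed(2)
n <- 500
x <- runif(n, 0, 2)
y <- 1.5 * exp(0.8 * x) + rnorm(n, sd = 0.3)
out_idx <- sample(n, 25)
y[out_idx] <- y[out_idx] + 20                  # inject a few large outliers

dat <- data.frame(x = x, y = y)
fit_ls  <- nls(y ~ a * exp(b * x), data = dat, start = list(a = 1, b = 1))
fit_med <- nlrq(y ~ a * exp(b * x), data = dat, start = list(a = 1, b = 1), tau = 0.5)

coef(fit_ls)    # least-squares estimates, pulled by the outliers
coef(fit_med)   # median-regression estimates, far less affected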
Keywords: Non-Linear Regression, MLP, spurious regression, spurious learning
JEL Classification: C01, C02, C19, C55