Investigating Long Short-Term Memory Neural Networks for Financial Time-Series Prediction

49 Pages | Posted: 22 Jun 2018


Tao Tong

University of California, Berkeley, Haas School of Business, Financial Engineering, Students

Manas Shah

University of California, Berkeley, Haas School of Business, Financial Engineering, Students

Manoj Cherukumalli

University of California, Berkeley, Haas School of Business, Financial Engineering, Students

Yasmine Moulehiawy

University of California, Berkeley, Haas School of Business, Financial Engineering, Students

Date Written: March 5, 2018

Abstract

Stock prices are co-integrated with fundamental valuation factors (earnings, revenue, cash flow, etc.). This has allowed legendary value investors such as Warren Buffett to enjoy tremendous investment success over the long term. However, the market is efficient at discounting past information and consensus future expectations. The ability to read non-obvious information and to better predict future valuation factors (surprises relative to consensus estimates) would therefore likely lead to investment outperformance (alpha) over the long run. In recently published work, Alberg and Lipton used various artificial neural networks (ANNs) to predict companies' valuation ratios from their historical fundamental factors and reported promising results. In line with this idea, this research project uses a Long Short-Term Memory (LSTM) neural network, a type of recurrent neural network (RNN), to predict companies' future fundamental valuation factors and then tests investment results by applying active risk-return-optimized portfolio strategies. The reasons for choosing an LSTM network for this study are the following: (1) like a deep neural network, an LSTM is a flexible universal function approximator suited to time-series forecasting; (2) unlike a vanilla RNN, an LSTM does not suffer from the vanishing-gradient problem and is well suited to discovering long-range dependencies, hence its name. Given that LSTM networks (and deep neural networks in general) have a reputation for over-fitting to in-sample data, we spend a significant amount of effort studying this over-fitting behavior and lay out systematic procedures for detecting and mitigating it. The current work gives us confidence and excitement that much more can be explored to potentially further improve prediction performance and investment returns.
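To make the modeling setup concrete, the following is a minimal, self-contained sketch (in PyTorch, not the authors' code) of an LSTM forecaster of the kind described above: it maps a window of a company's historical fundamental factors to a one-step-ahead valuation-factor prediction and tracks a held-out validation loss to watch for over-fitting. The class name FundamentalLSTM, layer sizes, window length, and the synthetic data are all illustrative assumptions.

```python
# Illustrative sketch only: a small LSTM that predicts the next-period
# valuation factor from a window of past fundamental factors, with a
# held-out split used to monitor over-fitting. Sizes and data are made up.
import torch
import torch.nn as nn


class FundamentalLSTM(nn.Module):
    def __init__(self, n_factors: int, hidden_size: int = 32):
        super().__init__()
        # LSTM over a sequence of past fundamental factors (batch, seq, features)
        self.lstm = nn.LSTM(input_size=n_factors, hidden_size=hidden_size,
                            batch_first=True)
        # Linear head mapping the final hidden state to one predicted ratio
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)          # out: (batch, seq_len, hidden_size)
        return self.head(out[:, -1, :])  # use only the last time step


# Synthetic stand-in data: 512 samples, 8 past periods, 5 fundamental factors.
torch.manual_seed(0)
X = torch.randn(512, 8, 5)
y = 0.5 * X[:, -1, :1] + 0.1 * torch.randn(512, 1)  # toy target

# Simple train/validation split to expose out-of-sample error.
X_train, y_train = X[:400], y[:400]
X_val, y_val = X[400:], y[400:]

model = FundamentalLSTM(n_factors=5)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(50):
    model.train()
    optimizer.zero_grad()
    train_loss = loss_fn(model(X_train), y_train)
    train_loss.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(X_val), y_val)

    # A widening gap between training and validation loss is one signal of
    # over-fitting; early stopping or stronger regularization would follow.
    if epoch % 10 == 0:
        print(f"epoch {epoch:3d}  train {train_loss.item():.4f}  "
              f"val {val_loss.item():.4f}")
```

In this sketch, the validation curve plays the role of the over-fitting diagnostic discussed in the abstract; in a real study the split would be chronological and the predicted factor would feed into the portfolio-construction step.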

Suggested Citation

Tong, Tao and Shah, Manas and Cherukumalli, Manoj and Moulehiawy, Yasmine, Investigating Long Short-Term Memory Neural Networks for Financial Time-Series Prediction (March 5, 2018). Available at SSRN: https://ssrn.com/abstract=3175336 or http://dx.doi.org/10.2139/ssrn.3175336

Tao Tong

University of California, Berkeley, Haas School of Business, Financial Engineering, Students

2220 Piedmont Avenue
Berkeley, CA
United States

Manas Shah (Contact Author)

University of California, Berkeley, Haas School of Business, Financial Engineering, Students

2220 Piedmont Avenue
Berkeley, CA
United States

Manoj Cherukumalli

University of California, Berkeley, Haas School of Business, Financial Engineering, Students

2220 Piedmont Avenue
Berkeley, CA
United States

Yasmine Moulehiawy

University of California, Berkeley, Haas School of Business, Financial Engineering, Students

2220 Piedmont Avenue
Berkeley, CA
United States
