DRAWBACKS OF LSTM ALGORITHM: A CASE STUDY

15 Pages. Posted: 3 Jan 2025

Date Written: January 01, 2025

Abstract

Long Short-Term Memory (LSTM) networks are a popular variant of Recurrent Neural Networks (RNNs) designed to overcome the difficulty of learning long-range dependencies in sequential data. LSTMs nevertheless have several drawbacks. Their intricate architecture, built from multiple gates and a memory cell, makes them computationally expensive, increasing both training time and memory consumption. Despite improving on conventional RNNs, LSTMs still struggle with very long sequences, particularly when capturing long-term dependencies. Overfitting is another issue, especially when training data is scarce. LSTMs are also frequently criticized for poor interpretability, which makes it difficult to understand the reasoning behind their predictions. Training instability, hyperparameter sensitivity, and limited parallelizability further complicate their use. With the emergence of alternative models such as Transformers, which parallelize better and handle long-range dependencies more effectively, LSTMs are becoming less relevant in many applications.
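To make the architectural complexity concrete, the following is a minimal sketch of a single LSTM cell step in plain NumPy. It follows the standard LSTM formulation (forget, input, and output gates plus a candidate memory update); the weight names (W_f, W_i, W_o, W_c), the toy dimensions, and the initialization are illustrative assumptions of this sketch, not details taken from the paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, p):
    # One step of a standard LSTM cell. Each gate performs its own
    # matrix multiplication over the concatenated [h_prev, x_t],
    # which is the main source of the per-step compute cost.
    z = np.concatenate([h_prev, x_t])        # recurrent state + current input
    f = sigmoid(p["W_f"] @ z + p["b_f"])     # forget gate
    i = sigmoid(p["W_i"] @ z + p["b_i"])     # input gate
    o = sigmoid(p["W_o"] @ z + p["b_o"])     # output gate
    g = np.tanh(p["W_c"] @ z + p["b_c"])     # candidate memory content
    c_t = f * c_prev + i * g                 # updated memory cell
    h_t = o * np.tanh(c_t)                   # new hidden state
    return h_t, c_t

# Toy usage with illustrative sizes (hidden size 4, input size 3).
rng = np.random.default_rng(0)
H, X = 4, 3
p = {k: 0.1 * rng.standard_normal((H, H + X)) for k in ("W_f", "W_i", "W_o", "W_c")}
p.update({k: np.zeros(H) for k in ("b_f", "b_i", "b_o", "b_c")})
h, c = np.zeros(H), np.zeros(H)
for t in range(5):                           # time steps must run one after another
    h, c = lstm_step(rng.standard_normal(X), h, c, p)

The sketch illustrates two of the drawbacks discussed above: the four separate gate computations per time step drive up training cost relative to a simple RNN, and the loop over time steps is inherently sequential, which is why LSTMs parallelize poorly compared with Transformers.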

Keywords: lstm, rnn, neural networks, stocks, overfitting

Suggested Citation

Kandadi, Thirupathi and Shankarlingam, G., DRAWBACKS OF LSTM ALGORITHM: A CASE STUDY (January 01, 2025). Available at SSRN: https://ssrn.com/abstract=5080605 or http://dx.doi.org/10.2139/ssrn.5080605

G. Shankarlingam

Chaitanya Deemed to be University

Hanamkonda
India

Paper statistics

Downloads: 364
Abstract Views: 2,354
Rank: 178,733