DRAWBACKS OF LSTM ALGORITHM: A CASE STUDY
15 Pages Posted: 3 Jan 2025
Date Written: January 01, 2025
Abstract
Long Short-Term Memory (LSTM) networks are a popular variant of Recurrent Neural Networks (RNNs) designed to overcome the difficulty of learning long-range dependencies in sequential data. LSTMs nevertheless have several drawbacks. Their intricate architecture, built from multiple gates and memory cells, makes them computationally expensive and increases training time and memory consumption. Despite their improvements over conventional RNNs, LSTMs still struggle with very long sequences, particularly when capturing long-term dependencies. Overfitting is another concern, especially when training data is scarce. LSTMs are also frequently criticized for poor interpretability, which makes it difficult to understand the reasoning behind their predictions. Training instability, hyperparameter sensitivity, and limited parallelization further complicate their use. With the emergence of alternative models such as Transformers, which parallelize better and handle long-range dependencies more effectively, LSTMs are becoming less relevant in many applications.
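To make the architectural cost concrete, the following is a minimal NumPy sketch of a single LSTM cell step, not the paper's implementation; the function name lstm_step and the convention of stacking the four gate blocks into one weight matrix are illustrative assumptions. It shows why each timestep is expensive (four gate computations against both the input and the previous hidden state) and why the recurrence resists parallelization.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM cell step (illustrative sketch). W, U, b each hold four
    stacked blocks (input, forget, output, candidate), so every step
    costs four matrix multiplies against x_t and four against h_prev."""
    z = W @ x_t + U @ h_prev + b          # (4H,) pre-activations for all gates
    H = h_prev.shape[0]
    i = sigmoid(z[0*H:1*H])               # input gate
    f = sigmoid(z[1*H:2*H])               # forget gate
    o = sigmoid(z[2*H:3*H])               # output gate
    g = np.tanh(z[3*H:4*H])               # candidate cell state
    c = f * c_prev + i * g                # new memory cell state
    h = o * np.tanh(c)                    # new hidden state
    return h, c

# Tiny example: hidden size 4, input size 3, sequence length 5.
rng = np.random.default_rng(0)
H_SIZE, X_SIZE, T = 4, 3, 5
W = rng.normal(size=(4 * H_SIZE, X_SIZE))
U = rng.normal(size=(4 * H_SIZE, H_SIZE))
b = np.zeros(4 * H_SIZE)
h = np.zeros(H_SIZE)
c = np.zeros(H_SIZE)
for t in range(T):                        # steps are inherently sequential:
    x_t = rng.normal(size=X_SIZE)         # h_t depends on h_{t-1}, so this
    h, c = lstm_step(x_t, h, c, W, U, b)  # loop cannot be parallelized over t
print(h)
```

The sequential loop at the end illustrates the parallelization constraint discussed above: unlike a Transformer's attention, which can process all timesteps at once, each LSTM step must wait for the previous hidden and cell states.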
Keywords: LSTM, RNN, neural networks, stocks, overfitting