A Sensing Data and Deep Learning-Based Sign Language Recognition Approach
30 Pages · Posted: 16 Nov 2023
Abstract
Accurate sign language recognition helps deaf people communicate with hearing people. Recent research has focused on sensing data-based sign language recognition, but most of it relies on traditional machine learning approaches, which cannot adequately extract the spatio-temporal features of sign language gestures. Although deep learning has been used to recognize dynamic sign language gestures, the accuracy still needs to be improved. In this paper, we propose an approach that combines multiple networks to enable accurate real-time sign language recognition. Firstly, we design a one-dimensional convolutional neural network (CNN) that uses skip connections, which improve feature extraction by fusing the outputs of different layers. Secondly, we propose an improved multi-head attention mechanism that incorporates bidirectional long short-term memory (BiLSTM) networks to capture the internal temporal properties of sign language and further extract the crucial motion features. Thirdly, we place this improved multi-head attention mechanism after the final convolutional layer of the proposed CNN and denote the resulting architecture BMCNN. Finally, we verify the performance of our approach through ten-fold cross-validation on the sensing dataset. Our approach achieves an accuracy of 99.44% in sign language recognition, outperforming traditional machine learning approaches and other state-of-the-art techniques in this field.
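The abstract does not give implementation details, but a minimal sketch of the described pipeline (a 1D CNN with skip connections whose final features are passed through a BiLSTM and multi-head self-attention before classification) might look like the following. All layer widths, kernel sizes, sensor channel counts, and the number of classes are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch (not the authors' implementation): a 1D CNN with skip connections
# followed by a BiLSTM + multi-head attention block, as outlined in the abstract.
# Hyperparameters below (channels, hidden size, heads, classes) are assumptions.
import torch
import torch.nn as nn


class SkipConv1dBlock(nn.Module):
    """1D convolution whose input is added back to its output (skip connection)."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(channels, channels, kernel_size, padding=kernel_size // 2)
        self.act = nn.ReLU()

    def forward(self, x):                      # x: (batch, channels, time)
        return self.act(self.conv(x) + x)      # fuse features from different layers


class BMCNNSketch(nn.Module):
    """CNN backbone + BiLSTM + multi-head attention, then a classifier head."""
    def __init__(self, in_channels=6, hidden=64, num_heads=4, num_classes=20):
        super().__init__()
        self.stem = nn.Conv1d(in_channels, hidden, kernel_size=3, padding=1)
        self.blocks = nn.Sequential(SkipConv1dBlock(hidden), SkipConv1dBlock(hidden))
        self.bilstm = nn.LSTM(hidden, hidden // 2, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(hidden, num_heads, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):                      # x: (batch, channels, time) sensor windows
        h = self.blocks(self.stem(x))          # convolutional features with skips
        h = h.transpose(1, 2)                  # -> (batch, time, hidden) for the RNN
        h, _ = self.bilstm(h)                  # temporal context in both directions
        h, _ = self.attn(h, h, h)              # self-attention over the BiLSTM outputs
        return self.head(h.mean(dim=1))        # pool over time and classify


if __name__ == "__main__":
    logits = BMCNNSketch()(torch.randn(8, 6, 128))  # 8 windows, 6 channels, 128 steps
    print(logits.shape)                             # torch.Size([8, 20])
```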
Keywords: Sign language recognition, Sensing data, Skip connection CNN, Multi-head attention mechanism, BiLSTM