In-Vehicle Emotion Recognition and Regulation System Based on Manual Feature Extraction
18 Pages
Posted: 16 May 2025
Abstract
As driving becomes an increasingly common mode of transportation, the automotive industry has placed growing emphasis on both driving safety and the overall travel experience, and a substantial body of research highlights the crucial influence of emotions on both. In this article, we introduce an in-car emotion recognition and interaction system designed to respond intelligently to drivers' emotional states. The system captures real-time emotional data through a user input layer and integrates it into the vehicle's technological architecture, hosted on the vehicle's CPU. Deep learning models recognize the driver's emotional state and trigger personalized emotion regulation strategies in the interaction feedback layer. Notably, our study presents a novel fused speech feature, MFCCs+, designed specifically for driving contexts. In addition, we optimize the driving speech emotion recognition model with a 1D-CNN, improving recognition accuracy by 10%. Validation experiments further confirm the system's effectiveness in enhancing driving safety. In conclusion, emotion-based interaction solutions have substantial potential to improve both driving safety and the travel experience in intelligent driving scenarios, pointing toward safer and more enjoyable automotive travel.
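The abstract does not specify how the MFCCs+ fusion feature is composed or how the 1D-CNN is configured. As a rough, non-authoritative illustration of the general approach, the sketch below assumes MFCCs stacked with delta coefficients and simple prosodic descriptors as a stand-in for MFCCs+, and a small 1D convolutional classifier over the feature channels; the file name, feature mix, layer sizes, and emotion classes are all hypothetical.

```python
# Illustrative sketch only: an assumed MFCC-based fusion feature and a
# minimal 1D-CNN emotion classifier. The paper's actual MFCCs+ composition
# and network architecture are not described in the abstract.
import numpy as np
import librosa
import torch
import torch.nn as nn

def extract_fused_features(wav_path, n_mfcc=13):
    """Stack MFCCs with delta and delta-delta coefficients plus simple
    prosodic descriptors (an assumed stand-in for the paper's MFCCs+)."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, frames)
    d1 = librosa.feature.delta(mfcc)                          # first-order deltas
    d2 = librosa.feature.delta(mfcc, order=2)                 # second-order deltas
    rms = librosa.feature.rms(y=y)                            # frame energy (1, frames)
    zcr = librosa.feature.zero_crossing_rate(y)               # zero-crossing rate (1, frames)
    t = min(mfcc.shape[1], rms.shape[1], zcr.shape[1])        # align frame counts
    feats = np.vstack([mfcc[:, :t], d1[:, :t], d2[:, :t], rms[:, :t], zcr[:, :t]])
    return feats.astype(np.float32)                           # (channels, frames)

class Emotion1DCNN(nn.Module):
    """Minimal 1D-CNN over the feature channels; depth, kernel sizes, and
    the emotion class set are assumptions, not the paper's settings."""
    def __init__(self, in_channels=41, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.fc = nn.Linear(128, n_classes)

    def forward(self, x):  # x: (batch, channels, frames)
        return self.fc(self.net(x).squeeze(-1))

# Hypothetical usage with a recorded in-car utterance:
# feats = extract_fused_features("driver_utterance.wav")
# logits = Emotion1DCNN(in_channels=feats.shape[0])(torch.from_numpy(feats).unsqueeze(0))
```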
Keywords: Human-machine interaction (HMI), Automotive travel, Feature selection, Speech emotion recognition, Machine learning, Emotion regulation