Single-Modality Emotion Detection: EEG-Based Feature Engineering and Interpretability

Publication Type : Conference Paper

Publisher : IEEE

Source : 2025 Eleventh International Conference on Bio Signals, Images, and Instrumentation (ICBSII)

URL : https://doi.org/10.1109/icbsii65145.2025.11013139

Campus : Coimbatore

School : School of Artificial Intelligence

Year : 2025

Abstract : Emotion recognition plays a vital role in human-computer interaction, with applications in healthcare, education, and entertainment. This research proposes a single-modality paradigm for emotion detection that leverages EEG signals from the DREAMER dataset to predict valence, arousal, and dominance. Unlike approaches that rely on computationally expensive neural networks, we use lightweight machine learning models and advanced feature engineering to achieve accuracies of 98.6%, 96.6%, and 95.7%, respectively, close to state-of-the-art accuracies, while keeping inference time to just 9.78 milliseconds. A further highlight of our work is that we identify which EEG channels and features most strongly influence each emotional dimension, improving the interpretability of our results. By combining high accuracy, efficiency, and interpretability, our approach demonstrates the feasibility of scalable, explainable emotion recognition systems using EEG data alone.
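The pipeline the abstract describes, band-limited EEG features fed to a lightweight classifier whose feature importances map back to channels, can be sketched as follows. This is a minimal illustration, not the paper's actual method: the synthetic data, the band-power features, and the random-forest classifier are all assumptions chosen for clarity (DREAMER does use 14-channel EEG sampled at 128 Hz).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples, fs = 120, 14, 512, 128  # 14 channels at 128 Hz, as in DREAMER

# Synthetic EEG-like trials: positive-class trials get an extra 10 Hz (alpha) burst
# on the first three channels, so those channel/band features carry the label.
X_raw = rng.normal(size=(n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)  # e.g. a binarized high/low valence label
t = np.arange(n_samples) / fs
for i in np.flatnonzero(y == 1):
    X_raw[i, :3] += 0.8 * np.sin(2 * np.pi * 10 * t)

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(trial):
    """Mean periodogram power per (channel, band), flattened to one feature vector."""
    freqs = np.fft.rfftfreq(n_samples, 1 / fs)
    psd = np.abs(np.fft.rfft(trial, axis=-1)) ** 2
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in bands.values()]
    return np.array(feats).T.ravel()  # length: n_channels * n_bands

X = np.array([band_powers(tr) for tr in X_raw])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0, stratify=y)

# A lightweight model in place of a deep network; importances give interpretability.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)

# Each feature index maps back to (channel, band), so the top importances name
# the channels and frequency bands driving the prediction.
n_bands = len(bands)
top = np.argsort(clf.feature_importances_)[::-1][:5]
top_named = [(int(f) // n_bands, list(bands)[int(f) % n_bands]) for f in top]
```

Here interpretability falls out of the feature layout: because each column of `X` corresponds to one (channel, band) pair, ranking `feature_importances_` directly identifies which channels and bands matter most for the given emotional dimension, echoing the channel-level analysis the abstract reports.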

Cite this Research Publication : As Vishal, S Deepan, V Amrutha, "Single-Modality Emotion Detection: EEG-Based Feature Engineering and Interpretability," 2025 Eleventh International Conference on Bio Signals, Images, and Instrumentation (ICBSII), IEEE, 2025, https://doi.org/10.1109/icbsii65145.2025.11013139