
EEG-ERnet: Emotion Recognition based on Rhythmic EEG Convolutional Neural Network Model.

Author Information

Zhang Shuang, Ling Chen, Wu Jingru, Li Jiawen, Wang Jiujiang, Yu Yuanyu, Liu Xin, Lv Jujian, Vai Mang I, Chen Rongjun

Affiliations

Key Laboratory of Numerical Simulation of Sichuan Provincial Universities, School of Mathematics and Information Sciences, Neijiang Normal University, 641000 Neijiang, Sichuan, China.

School of Artificial Intelligence, Neijiang Normal University, 641004 Neijiang, Sichuan, China.

Publication Information

J Integr Neurosci. 2025 Aug 28;24(8):41547. doi: 10.31083/JIN41547.

Abstract

BACKGROUND

Emotion recognition from electroencephalography (EEG) can play a pivotal role in the advancement of brain-computer interfaces (BCIs). Recent developments in deep learning, particularly convolutional neural networks (CNNs) and hybrid models, have significantly enhanced interest in this field. However, standard convolutional layers often conflate characteristics across various brain rhythms, complicating the identification of distinctive features vital for emotion recognition. Furthermore, emotions are inherently dynamic, and neglecting their temporal variability can lead to redundant or noisy data, thus reducing recognition performance. Complicating matters further, individuals may exhibit varied emotional responses to identical stimuli due to differences in experience, culture, and background, emphasizing the necessity for subject-independent classification models.

METHODS

To address these challenges, we propose a novel network model based on depthwise parallel CNNs. Power spectral densities (PSDs) of the individual rhythms are extracted and projected as 2D images that jointly encode channel, rhythm, and temporal properties. These rhythmic image representations are then fed to a newly designed network, EEG-ERnet (Emotion Recognition Network), for emotion recognition.
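The rhythmic-image construction described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the band boundaries, Welch PSD estimator, 5-second windowing, and the `rhythmic_psd_images` helper are all assumptions chosen to match the abstract's description (per-rhythm PSDs arranged as channel-by-rhythm images over time).

```python
import numpy as np
from scipy.signal import welch

# Assumed rhythm bands in Hz; the paper's exact band definitions may differ.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def rhythmic_psd_images(eeg, fs=128, win_sec=5):
    """Turn raw EEG (channels, samples) into per-window rhythmic images.

    Returns an array of shape (n_windows, n_bands, n_channels): one
    2D image per 5-second window, rows = rhythms, columns = channels.
    """
    n_ch, n_samp = eeg.shape
    win = int(win_sec * fs)
    images = []
    for start in range(0, n_samp - win + 1, win):
        seg = eeg[:, start:start + win]
        # Welch PSD per channel over the window.
        freqs, psd = welch(seg, fs=fs, nperseg=min(win, 256), axis=-1)
        band_feats = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            band_feats.append(psd[:, mask].mean(axis=-1))  # mean power per channel
        images.append(np.stack(band_feats))  # (n_bands, n_channels)
    return np.array(images)

# Demo with synthetic 32-channel EEG (DEAP records 32 channels at 128 Hz).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 128 * 60))  # 60 s of fake data
imgs = rhythmic_psd_images(eeg)
print(imgs.shape)  # 12 five-second windows -> (12, 4, 32)
```

In the actual model, each such image would be passed to a depthwise parallel CNN so that features from different rhythms are not conflated by shared convolutional kernels.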

RESULTS

Experiments conducted on the dataset for emotion analysis using physiological signals (DEAP) using 10-fold cross-validation demonstrate that emotion-specific rhythms within 5-second time intervals can effectively support emotion classification. The model achieves average classification accuracies of 93.27 ± 3.05%, 92.16 ± 2.73%, 90.56 ± 4.44%, and 86.68 ± 5.66% for valence, arousal, dominance, and liking, respectively.
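The 10-fold cross-validation protocol used to obtain these averages can be illustrated with a stand-in classifier. This sketch uses synthetic features and a logistic-regression placeholder instead of EEG-ERnet; the feature dimensionality and labels are invented for demonstration, and only the evaluation loop (10 folds, mean ± standard deviation of accuracy) mirrors the reported setup.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 128))  # hypothetical flattened rhythmic-image features
# Hypothetical binary valence labels, weakly correlated with the first feature.
y = (X[:, 0] + 0.5 * rng.standard_normal(200) > 0).astype(int)

accs = []
for train_idx, test_idx in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    accs.append(clf.score(X[test_idx], y[test_idx]))

# Report mean ± std accuracy across the 10 folds, as in the abstract.
print(f"accuracy: {np.mean(accs):.2%} ± {np.std(accs):.2%}")
```

Note that a subject-independent evaluation, which the paper emphasizes, would instead require grouping folds by subject (e.g. leave-subjects-out) so that no participant appears in both training and test splits.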

CONCLUSIONS

These findings provide valuable insights into the rhythmic characteristics of emotional EEG signals. Furthermore, the EEG-ERnet model offers a promising pathway for the development of efficient, subject-independent, and portable emotion-aware systems for real-world applications.
