
Exploring EEG Features in Cross-Subject Emotion Recognition.

Authors

Li Xiang, Song Dawei, Zhang Peng, Zhang Yazhou, Hou Yuexian, Hu Bin

Affiliations

Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, China.

School of Computer Science and Technology, Beijing Institute of Technology, Beijing, China.

Publication Information

Front Neurosci. 2018 Mar 19;12:162. doi: 10.3389/fnins.2018.00162. eCollection 2018.

Abstract

Recognizing cross-subject emotions based on brain imaging data, e.g., EEG, has always been difficult due to the poor generalizability of features across subjects. Thus, systematically exploring the ability of different EEG features to identify emotional information across subjects is crucial. Prior work has explored this question with only one or two kinds of features and has reported differing findings and conclusions. In this work, we aim at a more comprehensive investigation of this question with a wider range of feature types, including 18 kinds of linear and non-linear EEG features. The effectiveness of these features was examined on two publicly accessible datasets, namely, the dataset for emotion analysis using physiological signals (DEAP) and the SJTU emotion EEG dataset (SEED). We adopted the support vector machine (SVM) approach and the "leave-one-subject-out" verification strategy to evaluate recognition performance. Using automatic feature selection methods, the highest mean recognition accuracies of 59.06% (AUC = 0.605) on the DEAP dataset and 83.33% (AUC = 0.904) on the SEED dataset were reached. Furthermore, using manual feature selection on the SEED dataset, we explored the importance of different EEG features in cross-subject emotion recognition from multiple perspectives, including different channels, brain regions, rhythms, and feature types. For example, we found that the Hjorth parameter of mobility in the beta rhythm achieved the best mean recognition accuracy among all features. Through a pilot correlation analysis, we further examined the highly correlated features to better understand what allows them to differentiate cross-subject emotions. Several notable observations were made. The results of this paper validate the possibility of exploring robust EEG features in cross-subject emotion recognition.
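The abstract does not specify the implementation details, but the described pipeline can be illustrated with a minimal sketch: compute one of the 18 feature types, here the Hjorth mobility parameter (defined as sqrt(Var(x') / Var(x)) for a signal x and its first derivative x') in the beta rhythm (roughly 13-30 Hz), and score it with a linear SVM under leave-one-subject-out cross-validation. This is not the authors' code; the data arrays, sampling rate (128 Hz, as in the preprocessed DEAP release), channel count, and helper names such as hjorth_mobility and beta_band are assumptions made for illustration, using scikit-learn and SciPy.

```python
# Sketch of the cross-subject evaluation the abstract describes (not the authors' code).
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score


def hjorth_mobility(x: np.ndarray) -> np.ndarray:
    """Hjorth mobility = sqrt(var(dx/dt) / var(x)), computed per channel.

    x has shape (n_channels, n_samples).
    """
    dx = np.diff(x, axis=-1)
    return np.sqrt(np.var(dx, axis=-1) / np.var(x, axis=-1))


def beta_band(x: np.ndarray, fs: float = 128.0) -> np.ndarray:
    """Band-pass filter the signal to the beta rhythm (13-30 Hz)."""
    b, a = butter(4, [13.0, 30.0], btype="bandpass", fs=fs)
    return filtfilt(b, a, x, axis=-1)


# Hypothetical data: trials of shape (n_trials, n_channels, n_samples),
# one emotion label per trial, and the subject each trial came from.
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 120, 32, 1280
trials = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, size=n_trials)            # e.g., low/high valence
subjects = np.repeat(np.arange(10), n_trials // 10)   # 10 subjects, 12 trials each

# One feature vector per trial: beta-band Hjorth mobility of every channel.
features = np.array([hjorth_mobility(beta_band(t)) for t in trials])

# Leave-one-subject-out: each fold trains on 9 subjects and tests on the held-out one.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, features, labels, groups=subjects,
                         cv=LeaveOneGroupOut())
print(f"mean cross-subject accuracy: {scores.mean():.3f}")
```

With random placeholder data the accuracy hovers around chance; the point of the sketch is only the structure of the evaluation, in which no subject contributes trials to both the training and test folds.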


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f842/5867345/511ffd86d7cd/fnins-12-00162-g0001.jpg
