
SFT-SGAT: A semi-supervised fine-tuning self-supervised graph attention network for emotion recognition and consciousness detection.

Affiliations

School of Artificial Intelligence, South China Normal University, Guangzhou, 510630, China; Research Station in Mathematics, South China Normal University, Guangzhou, 510630, China.

School of Artificial Intelligence, South China Normal University, Guangzhou, 510630, China.

Publication

Neural Netw. 2024 Dec;180:106643. doi: 10.1016/j.neunet.2024.106643. Epub 2024 Aug 22.

DOI: 10.1016/j.neunet.2024.106643
PMID: 39186838
Abstract

Emotional recognition is highly important in the field of brain-computer interfaces (BCIs). However, due to the individual variability in electroencephalogram (EEG) signals and the challenges in obtaining accurate emotional labels, traditional methods have shown poor performance in cross-subject emotion recognition. In this study, we propose a cross-subject EEG emotion recognition method based on a semi-supervised fine-tuning self-supervised graph attention network (SFT-SGAT). First, we model multi-channel EEG signals by constructing a graph structure that dynamically captures the spatiotemporal topological features of EEG signals. Second, we employ a self-supervised graph attention neural network to facilitate model training, mitigating the impact of signal noise on the model. Finally, a semi-supervised approach is used to fine-tune the model, enhancing its generalization ability in cross-subject classification. By combining supervised and unsupervised learning techniques, the SFT-SGAT maximizes the utility of limited labeled data in EEG emotion recognition tasks, thereby enhancing the model's performance. Experiments based on leave-one-subject-out cross-validation demonstrate that SFT-SGAT achieves state-of-the-art cross-subject emotion recognition performance on the SEED and SEED-IV datasets, with accuracies of 92.04% and 82.76%, respectively. Furthermore, experiments conducted on a self-collected dataset comprising ten healthy subjects and eight patients with disorders of consciousness (DOCs) revealed that the SFT-SGAT attains high classification performance in healthy subjects (maximum accuracy of 95.84%) and was successfully applied to DOC patients, with four patients achieving emotion recognition accuracies exceeding 60%. The experiments demonstrate the effectiveness of the proposed SFT-SGAT model in cross-subject EEG emotion recognition and its potential for assessing levels of consciousness in patients with DOC.
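The abstract describes the model only at a high level, so the following is purely a hedged illustration, not the authors' implementation: a minimal single-head graph attention layer in NumPy, of the kind that graph attention networks such as SFT-SGAT are built on, with EEG channels as graph nodes. All names and shapes here are assumptions for the sketch.

```python
import numpy as np

def graph_attention_layer(X, A, W, a, leaky_slope=0.2):
    """Single-head graph attention update (illustrative sketch).

    X: (N, F) node features, e.g. one feature vector per EEG channel.
    A: (N, N) adjacency matrix; A[i, j] > 0 where channels i and j are connected.
    W: (F, F_out) shared linear projection.
    a: (2 * F_out,) attention parameter vector.
    Returns (N, F_out) updated node features.
    """
    H = X @ W                                   # project all node features
    N = H.shape[0]
    # attention logits e_ij = LeakyReLU(a^T [h_i || h_j])
    logits = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            z = np.concatenate([H[i], H[j]]) @ a
            logits[i, j] = z if z > 0 else leaky_slope * z
    # mask out non-edges, then softmax over each node's neighborhood
    logits = np.where(A > 0, logits, -1e9)
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    alpha = e / e.sum(axis=1, keepdims=True)    # (N, N) attention weights
    return alpha @ H                            # neighborhood-weighted features

# Example: 5 channels, fully connected graph with self-loops
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
A = np.ones((5, 5))
W = rng.normal(size=(8, 4))
a = rng.normal(size=(8,))
out = graph_attention_layer(X, A, W, a)   # shape (5, 4)
```

In the paper's setting, the adjacency structure is reportedly built dynamically from the spatiotemporal topology of the EEG signals rather than fixed as above; the fixed matrix here only keeps the sketch self-contained.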


Similar Articles

1. SFT-SGAT: A semi-supervised fine-tuning self-supervised graph attention network for emotion recognition and consciousness detection.
   Neural Netw. 2024 Dec;180:106643. doi: 10.1016/j.neunet.2024.106643. Epub 2024 Aug 22.
2. ST-SCGNN: A Spatio-Temporal Self-Constructing Graph Neural Network for Cross-Subject EEG-Based Emotion Recognition and Consciousness Detection.
   IEEE J Biomed Health Inform. 2024 Feb;28(2):777-788. doi: 10.1109/JBHI.2023.3335854. Epub 2024 Feb 5.
3. Cross-subject emotion recognition in brain-computer interface based on frequency band attention graph convolutional adversarial neural networks.
   J Neurosci Methods. 2024 Nov;411:110276. doi: 10.1016/j.jneumeth.2024.110276. Epub 2024 Sep 3.
4. EEG-based emotion charting for Parkinson's disease patients using Convolutional Recurrent Neural Networks and cross dataset learning.
   Comput Biol Med. 2022 May;144:105327. doi: 10.1016/j.compbiomed.2022.105327. Epub 2022 Mar 11.
5. Emotion recognition using spatial-temporal EEG features through convolutional graph attention network.
   J Neural Eng. 2023 Feb 14;20(1). doi: 10.1088/1741-2552/acb79e.
6. Multi-source Selective Graph Domain Adaptation Network for cross-subject EEG emotion recognition.
   Neural Netw. 2024 Dec;180:106742. doi: 10.1016/j.neunet.2024.106742. Epub 2024 Sep 24.
7. MSLTE: multiple self-supervised learning tasks for enhancing EEG emotion recognition.
   J Neural Eng. 2024 Apr 17;21(2). doi: 10.1088/1741-2552/ad3c28.
8. Cross-Subject EEG Emotion Recognition With Self-Organized Graph Neural Network.
   Front Neurosci. 2021 Jun 9;15:611653. doi: 10.3389/fnins.2021.611653. eCollection 2021.
9. Cross-subject emotion recognition using visibility graph and genetic algorithm-based convolution neural network.
   Chaos. 2022 Sep;32(9):093110. doi: 10.1063/5.0098454.
10. Multi-Scale Masked Autoencoders for Cross-Session Emotion Recognition.
    IEEE Trans Neural Syst Rehabil Eng. 2024;32:1637-1646. doi: 10.1109/TNSRE.2024.3389037. Epub 2024 Apr 22.

Cited By

1. Association prediction of lncRNAs and diseases using multiview graph convolution neural network.
   Front Genet. 2025 Apr 15;16:1568270. doi: 10.3389/fgene.2025.1568270. eCollection 2025.
2. A spatial and temporal transformer-based EEG emotion recognition in VR environment.
   Front Hum Neurosci. 2025 Feb 26;19:1517273. doi: 10.3389/fnhum.2025.1517273. eCollection 2025.