


EEG-fNIRS-Based Emotion Recognition Using Graph Convolution and Capsule Attention Network.

Authors

Chen Guijun, Liu Yue, Zhang Xueying

Affiliation

College of Electronic Information and Optical Engineering, Taiyuan University of Technology, Taiyuan 030024, China.

Publication

Brain Sci. 2024 Aug 16;14(8):820. doi: 10.3390/brainsci14080820.

DOI: 10.3390/brainsci14080820
PMID: 39199511
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11352237/
Abstract

Electroencephalogram (EEG) and functional near-infrared spectroscopy (fNIRS) can objectively reflect a person's emotional state and have been widely studied in emotion recognition. However, the effective feature fusion and discriminative feature learning from EEG-fNIRS data is challenging. In order to improve the accuracy of emotion recognition, a graph convolution and capsule attention network model (GCN-CA-CapsNet) is proposed. Firstly, EEG-fNIRS signals are collected from 50 subjects induced by emotional video clips. And then, the features of the EEG and fNIRS are extracted; the EEG-fNIRS features are fused to generate higher-quality primary capsules by graph convolution with the Pearson correlation adjacency matrix. Finally, the capsule attention module is introduced to assign different weights to the primary capsules, and higher-quality primary capsules are selected to generate better classification capsules in the dynamic routing mechanism. We validate the efficacy of the proposed method on our emotional EEG-fNIRS dataset with an ablation study. Extensive experiments demonstrate that the proposed GCN-CA-CapsNet method achieves a more satisfactory performance against the state-of-the-art methods, and the average accuracy can increase by 3-11%.
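As a rough illustration of the fusion step the abstract describes — combining per-channel EEG/fNIRS features by graph convolution over a Pearson-correlation adjacency matrix — here is a minimal NumPy sketch. The channel and feature counts, the random weights, the ReLU, and the symmetric normalization are illustrative assumptions, not the paper's exact GCN-CA-CapsNet architecture:

```python
import numpy as np

def pearson_adjacency(features):
    """Adjacency matrix from absolute pairwise Pearson correlations.

    features: (n_channels, n_features) array, one row per EEG/fNIRS channel.
    """
    adj = np.abs(np.corrcoef(features))  # |r| in [0, 1]
    np.fill_diagonal(adj, 1.0)           # keep self-loops
    return adj

def gcn_layer(x, adj, weight):
    """One symmetric-normalized graph convolution: relu(D^-1/2 A D^-1/2 X W)."""
    d_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
    a_norm = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(a_norm @ x @ weight, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((30, 8))    # 30 hypothetical channels, 8 features each
w = rng.standard_normal((8, 16))    # hypothetical layer weights
adj = pearson_adjacency(x)
out = gcn_layer(x, adj, w)          # (30, 16) fused channel embeddings
```

The outputs of such a layer would play the role of the "higher-quality primary capsules" the abstract mentions, with each channel's embedding informed by its correlated neighbors.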

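The capsule-attention idea — assigning different weights to primary capsules so that dynamic routing favors the higher-quality ones — can be sketched as below. The squash nonlinearity is the standard capsule-network one; the length-based softmax scoring is a hypothetical stand-in, since the paper's attention module learns its weights:

```python
import numpy as np

def squash(v, eps=1e-9):
    """Capsule squash: preserves direction, maps vector length into [0, 1)."""
    sq = np.sum(v * v, axis=-1, keepdims=True)
    return (sq / (1.0 + sq)) * v / np.sqrt(sq + eps)

def attention_weights(capsules):
    """Softmax over capsule lengths -- a stand-in scoring of capsule quality."""
    lengths = np.linalg.norm(capsules, axis=-1)
    e = np.exp(lengths - lengths.max())  # stable softmax
    return e / e.sum()

rng = np.random.default_rng(1)
primary = squash(rng.standard_normal((12, 8)))  # 12 primary capsules, dim 8
w = attention_weights(primary)                  # one weight per capsule
weighted = primary * w[:, None]                 # re-weighted primary capsules
```

The re-weighted capsules would then enter dynamic routing, so that low-scoring capsules contribute less to the classification capsules.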

Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/114e3366224d/brainsci-14-00820-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/b180ae7a8ffa/brainsci-14-00820-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/2576f3b99f39/brainsci-14-00820-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/bf783a73f4b6/brainsci-14-00820-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/ddab19fa7f6d/brainsci-14-00820-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/e9ea5ce37bd4/brainsci-14-00820-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/7bcd20a14034/brainsci-14-00820-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/a8bd0ac37952/brainsci-14-00820-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/ab9a9d662a71/brainsci-14-00820-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/e1f369d48e0c/brainsci-14-00820-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38f4/11352237/96d210105d4c/brainsci-14-00820-g011.jpg

Similar Articles

1
EEG-fNIRS-Based Emotion Recognition Using Graph Convolution and Capsule Attention Network.
Brain Sci. 2024 Aug 16;14(8):820. doi: 10.3390/brainsci14080820.
2
Semi-supervised EEG emotion recognition model based on enhanced graph fusion and GCN.
J Neural Eng. 2022 Apr 14;19(2). doi: 10.1088/1741-2552/ac63ec.
3
An Efficient Graph Learning System for Emotion Recognition Inspired by the Cognitive Prior Graph of EEG Brain Network.
IEEE Trans Neural Netw Learn Syst. 2025 Apr;36(4):7130-7144. doi: 10.1109/TNNLS.2024.3405663. Epub 2025 Apr 4.
4
Granger-Causality-Based Multi-Frequency Band EEG Graph Feature Extraction and Fusion for Emotion Recognition.
Brain Sci. 2022 Dec 1;12(12):1649. doi: 10.3390/brainsci12121649.
5
3DCANN: A Spatio-Temporal Convolution Attention Neural Network for EEG Emotion Recognition.
IEEE J Biomed Health Inform. 2022 Nov;26(11):5321-5331. doi: 10.1109/JBHI.2021.3083525. Epub 2022 Nov 10.
6
Spatial-temporal features-based EEG emotion recognition using graph convolution network and long short-term memory.
Physiol Meas. 2023 Jun 8;44(6). doi: 10.1088/1361-6579/acd675.
7
STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition.
Front Hum Neurosci. 2023 Apr 13;17:1169949. doi: 10.3389/fnhum.2023.1169949. eCollection 2023.
8
MES-CTNet: A Novel Capsule Transformer Network Base on a Multi-Domain Feature Map for Electroencephalogram-Based Emotion Recognition.
Brain Sci. 2024 Mar 30;14(4):344. doi: 10.3390/brainsci14040344.
9
SCC-MPGCN: self-attention coherence clustering based on multi-pooling graph convolutional network for EEG emotion recognition.
J Neural Eng. 2022 Apr 21;19(2). doi: 10.1088/1741-2552/ac6294.
10
Hierarchical Dynamic Graph Convolutional Network With Interpretability for EEG-Based Emotion Recognition.
IEEE Trans Neural Netw Learn Syst. 2022 Dec 9;PP. doi: 10.1109/TNNLS.2022.3225855.

Cited By

1
Decoding basic emotional states through integration of an fNIRS-based brain-computer interface with supervised learning algorithms.
PLoS One. 2025 Jul 14;20(7):e0325850. doi: 10.1371/journal.pone.0325850. eCollection 2025.
2
Emotion Recognition Based on a EEG-fNIRS Hybrid Brain Network in the Source Space.
Brain Sci. 2024 Nov 22;14(12):1166. doi: 10.3390/brainsci14121166.

References

1
Cross-Subject Emotion Recognition Brain-Computer Interface Based on fNIRS and DBJNet.
Cyborg Bionic Syst. 2023 Jul 27;4:0045. doi: 10.34133/cbsystems.0045. eCollection 2023.
2
A multi-head residual connection GCN for EEG emotion recognition.
Comput Biol Med. 2023 Sep;163:107126. doi: 10.1016/j.compbiomed.2023.107126. Epub 2023 Jun 2.
3
ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement.
IEEE Trans Neural Syst Rehabil Eng. 2023;31:991-1000. doi: 10.1109/TNSRE.2023.3237319. Epub 2023 Feb 3.
4
TC-Net: A Transformer Capsule Network for EEG-based emotion recognition.
Comput Biol Med. 2023 Jan;152:106463. doi: 10.1016/j.compbiomed.2022.106463. Epub 2022 Dec 22.
5
Granger-Causality-Based Multi-Frequency Band EEG Graph Feature Extraction and Fusion for Emotion Recognition.
Brain Sci. 2022 Dec 1;12(12):1649. doi: 10.3390/brainsci12121649.
6
Deep learning in fNIRS: a review.
Neurophotonics. 2022 Oct;9(4):041411. doi: 10.1117/1.NPh.9.4.041411. Epub 2022 Jul 20.
7
Linking Multi-Layer Dynamical GCN With Style-Based Recalibration CNN for EEG-Based Emotion Recognition.
Front Neurorobot. 2022 Feb 24;16:834952. doi: 10.3389/fnbot.2022.834952. eCollection 2022.
8
Emotion recognition from EEG based on multi-task learning with capsule network and attention mechanism.
Comput Biol Med. 2022 Apr;143:105303. doi: 10.1016/j.compbiomed.2022.105303. Epub 2022 Feb 19.
9
Multi-Modal Integration of EEG-fNIRS for Characterization of Brain Activity Evoked by Preferred Music.
Front Neurorobot. 2022 Jan 31;16:823435. doi: 10.3389/fnbot.2022.823435. eCollection 2022.
10
FGANet: fNIRS-Guided Attention Network for Hybrid EEG-fNIRS Brain-Computer Interfaces.
IEEE Trans Neural Syst Rehabil Eng. 2022;30:329-339. doi: 10.1109/TNSRE.2022.3149899. Epub 2022 Feb 16.