
Time-frequency-space transformer EEG decoding for spinal cord injury.

Author Information

Xu Fangzhou, Liu Ming, Chen Xinyi, Yan Yihao, Zhao Jinzhao, Liu Yanbing, Zhao Jiaqi, Pang Shaopeng, Yin Sen, Leng Jiancai, Zhang Yang

Affiliations

International School for Optoelectronic Engineering, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, People's Republic of China.

School of Information and Automation Engineering, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, Shandong Province, China.

Publication Information

Cogn Neurodyn. 2024 Dec;18(6):3491-3506. doi: 10.1007/s11571-024-10135-8. Epub 2024 Jun 18.

Abstract

Transformer neural networks based on multi-head self-attention have proven effective in several fields. To capture brain activity in electroencephalographic (EEG) signals and construct an effective pattern recognition model, this paper explores a multi-channel deep feature decoding method based on the self-attention mechanism. By integrating inter-channel features with intra-channel features, the self-attention mechanism generates a deep feature vector that encompasses information from all brain activities. In this paper, a time-frequency-spatial domain analysis of motor imagery (MI)-based EEG signals from spinal cord injury patients is performed to construct a transformer neural network-based MI classification model. The proposed algorithm is named the time-frequency-spatial transformer. The time-frequency and spatial domain feature vectors extracted from the EEG signals are fed into the transformer neural network for multiple rounds of self-attention deep feature encoding, and a peak classification accuracy of 93.56% is attained through the fully connected layer. By constructing the attention-matrix brain network, it can be inferred that the channel connections formed by the attention heads are similar to the brain networks constructed from the raw EEG signals. The experimental results reveal that the self-attention coefficient brain network holds significant potential for brain activity analysis: it better illustrates correlated connections, highlights differences between samples, and can therefore provide a more discriminative approach for analyzing brain activity in clinical settings.
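The abstract describes the pipeline only in prose, so the following is a minimal PyTorch sketch of the idea, not the authors' implementation: per-channel time-frequency-spatial feature vectors (assumed here to be pre-extracted, e.g., band power plus spatial filtering) are projected into channel tokens, encoded by stacked multi-head self-attention blocks, and classified by a fully connected layer, while the per-trial channel-by-channel attention matrix is kept as a candidate "attention coefficient brain network". All class names, dimensions, and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch (assumption-laden, not the authors' code) of a
# time-frequency-spatial transformer for MI-EEG classification.
import torch
import torch.nn as nn


class TFSTransformerBlock(nn.Module):
    """Self-attention encoder block that also returns the channel-by-channel
    attention matrix, later usable as an 'attention coefficient brain network'."""

    def __init__(self, embed_dim, num_heads, ff_dim, dropout=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads,
                                          dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(embed_dim, ff_dim), nn.GELU(),
                                nn.Linear(ff_dim, embed_dim))
        self.norm1 = nn.LayerNorm(embed_dim)
        self.norm2 = nn.LayerNorm(embed_dim)
        self.drop = nn.Dropout(dropout)

    def forward(self, x):
        # x: (batch, n_channels, embed_dim), one token per EEG channel
        attn_out, attn_w = self.attn(x, x, x, need_weights=True)  # attn_w: (batch, n_ch, n_ch)
        x = self.norm1(x + self.drop(attn_out))
        x = self.norm2(x + self.drop(self.ff(x)))
        return x, attn_w


class TFSTransformer(nn.Module):
    """Stacked encoder blocks followed by a fully connected classification layer."""

    def __init__(self, feat_dim=64, embed_dim=64, num_heads=4,
                 ff_dim=128, depth=3, n_classes=2):
        super().__init__()
        self.embed = nn.Linear(feat_dim, embed_dim)        # project TFS features to tokens
        self.blocks = nn.ModuleList(
            TFSTransformerBlock(embed_dim, num_heads, ff_dim) for _ in range(depth))
        self.classifier = nn.Linear(embed_dim, n_classes)  # fully connected output layer

    def forward(self, feats):
        # feats: (batch, n_channels, feat_dim) pre-extracted time-frequency-spatial features
        x = self.embed(feats)
        attn_maps = []
        for block in self.blocks:
            x, w = block(x)
            attn_maps.append(w)                  # per-layer attention matrices
        logits = self.classifier(x.mean(dim=1))  # pool channel tokens, then classify
        # attn_maps[-1][i] is an n_channels x n_channels matrix for trial i: a
        # candidate "attention coefficient brain network".
        return logits, attn_maps


if __name__ == "__main__":
    model = TFSTransformer()
    trials = torch.randn(8, 32, 64)              # 8 trials, 32 channels, 64-dim features
    logits, attn_maps = model(trials)
    print(logits.shape, attn_maps[-1].shape)     # [8, 2] and [8, 32, 32]
```

The n_channels × n_channels attention matrix returned per trial is the object the abstract compares against brain networks built from the raw EEG signals; any such comparison (e.g., correlating the two adjacency matrices) would be done outside this sketch.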



