IEEE J Biomed Health Inform. 2024 Oct;28(10):5755-5767. doi: 10.1109/JBHI.2024.3395622. Epub 2024 Oct 3.
Because emotional expression in the central nervous system is objective, EEG-based emotion recognition can effectively reflect humans' internal emotional states. In recent years, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have made significant strides in extracting local features and temporal dependencies from EEG signals. However, CNNs ignore the spatial distribution of EEG electrodes, and RNNs may suffer from exploding/vanishing gradients and high time consumption. To address these limitations, we propose an attention-based temporal graph representation network (ATGRNet) for EEG-based emotion recognition. First, a hierarchical attention mechanism is introduced to integrate feature representations from frequency bands and channels of EEG signals, ordered by priority. Second, a graph convolutional neural network with a top-k operation is used to capture the internal relationships among EEG electrodes under different emotion patterns. Next, a residual-based graph readout mechanism is applied to aggregate node-level EEG feature representations into graph-level representations. Finally, the resulting graph-level representations are fed into a temporal convolutional network (TCN) to extract temporal dependencies between EEG frames. We evaluated the proposed ATGRNet on the SEED, DEAP, and FACED datasets. The experimental results show that ATGRNet surpasses state-of-the-art graph-based methods for EEG-based emotion recognition.
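To make the described pipeline concrete, below is a minimal PyTorch sketch of the four stages the abstract outlines: hierarchical band/channel attention, a graph convolution with top-k node selection, a residual graph readout, and a TCN over EEG frames. All module names, layer sizes, the learnable adjacency matrix, and the readout and TCN details are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BandChannelAttention(nn.Module):
    """Scores frequency bands first, then channels, and reweights the features.
    (Stands in for the paper's hierarchical attention; the exact formulation is assumed.)"""
    def __init__(self, n_bands: int, n_channels: int):
        super().__init__()
        self.band_score = nn.Linear(n_channels, 1)   # attention scores over bands
        self.chan_score = nn.Linear(n_bands, 1)      # attention scores over channels

    def forward(self, x):                            # x: (batch, bands, channels)
        band_w = torch.softmax(self.band_score(x).squeeze(-1), dim=-1)                   # (B, bands)
        x = x * band_w.unsqueeze(-1)
        chan_w = torch.softmax(self.chan_score(x.transpose(1, 2)).squeeze(-1), dim=-1)   # (B, channels)
        return x * chan_w.unsqueeze(1)


class TopKGraphConv(nn.Module):
    """One GCN layer over a learnable electrode adjacency, keeping only the top-k nodes."""
    def __init__(self, n_channels: int, in_dim: int, out_dim: int, k: int):
        super().__init__()
        self.adj = nn.Parameter(torch.randn(n_channels, n_channels) * 0.01)  # assumed learnable graph
        self.lin = nn.Linear(in_dim, out_dim)
        self.k = k

    def forward(self, x):                            # x: (B, channels, in_dim)
        a = torch.softmax(self.adj, dim=-1)          # row-normalised adjacency
        h = F.relu(self.lin(a @ x))                  # graph convolution: aggregate, then transform
        idx = h.norm(dim=-1).topk(self.k, dim=-1).indices          # keep the k most salient electrodes
        return torch.gather(h, 1, idx.unsqueeze(-1).expand(-1, -1, h.size(-1)))


class ResidualReadout(nn.Module):
    """Pools the kept node embeddings into a graph embedding (max pooling plus a mean 'residual')."""
    def forward(self, h):                            # h: (B, k, dim)
        return h.max(dim=1).values + h.mean(dim=1)


class TCNHead(nn.Module):
    """Two dilated causal 1-D convolutions over the frame sequence, then a linear classifier."""
    def __init__(self, dim: int, n_classes: int):
        super().__init__()
        self.conv1 = nn.Conv1d(dim, dim, kernel_size=3, padding=2, dilation=1)
        self.conv2 = nn.Conv1d(dim, dim, kernel_size=3, padding=4, dilation=2)
        self.fc = nn.Linear(dim, n_classes)

    def forward(self, g):                            # g: (B, frames, dim)
        t = g.size(1)
        z = g.transpose(1, 2)                        # (B, dim, frames)
        z = F.relu(self.conv1(z))[..., :t]           # drop the right-hand padding to keep causality
        z = F.relu(self.conv2(z))[..., :t]
        return self.fc(z[..., -1])                   # classify from the last frame


class ATGRNetSketch(nn.Module):
    """End-to-end sketch: per-frame attention + graph block, then a TCN across frames."""
    def __init__(self, n_bands=5, n_channels=62, k=16, dim=32, n_classes=3):
        super().__init__()
        self.att = BandChannelAttention(n_bands, n_channels)
        self.gcn = TopKGraphConv(n_channels, n_bands, dim, k)
        self.readout = ResidualReadout()
        self.tcn = TCNHead(dim, n_classes)

    def forward(self, x):                            # x: (B, frames, bands, channels)
        frames = []
        for t in range(x.size(1)):
            h = self.att(x[:, t])                    # (B, bands, channels)
            h = self.gcn(h.transpose(1, 2))          # (B, k, dim)
            frames.append(self.readout(h))           # (B, dim)
        return self.tcn(torch.stack(frames, dim=1))  # (B, n_classes)


# Toy forward pass: 4 trials, 10 frames, 5 frequency bands, 62 electrodes (SEED-like layout).
logits = ATGRNetSketch()(torch.randn(4, 10, 5, 62))
print(logits.shape)                                  # torch.Size([4, 3])
```

The per-frame loop keeps the sketch readable; a batched implementation would fold the frame dimension into the batch dimension before the graph block, and the actual model presumably uses richer per-node features than raw band values.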