Department of Electrical and Electronics Engineering, BITS-Pilani, Hyderabad Campus, Hyderabad, 500078, India.
Comput Biol Med. 2021 Jul;134:104428. doi: 10.1016/j.compbiomed.2021.104428. Epub 2021 May 6.
Emotion is interpreted as a psycho-physiological process associated with a person's personality, behavior, motivation, and character. The objective of affective computing is to recognize different types of emotions for human-computer interaction (HCI) applications. Spatiotemporal brain electrical activity is measured using multi-channel electroencephalogram (EEG) signals. Automated emotion recognition using multi-channel EEG signals is an exciting research topic in cognitive neuroscience and affective computing. This paper proposes a rhythm-specific, multi-channel convolutional neural network (CNN)-based approach for automated emotion recognition using multi-channel EEG signals. The delta (δ), theta (θ), alpha (α), beta (β), and gamma (γ) rhythms of the EEG signal from each channel are extracted using band-pass filters. The EEG rhythms from the selected channels, coupled with a deep CNN, are used for emotion classification tasks such as low-valence (LV) vs. high-valence (HV), low-arousal (LA) vs. high-arousal (HA), and low-dominance (LD) vs. high-dominance (HD). The deep CNN architecture considered in the proposed work has eight convolution, three average-pooling, four batch-normalization, three spatial-dropout, two dropout, one global average-pooling, and three dense layers. We have validated our developed model using three publicly available databases: DEAP, DREAMER, and DASPS. The results reveal that the proposed multivariate deep CNN approach coupled with the β-rhythm obtained accuracy values of 98.91%, 98.45%, and 98.69% for the LV vs. HV, LA vs. HA, and LD vs. HD emotion classification strategies, respectively, on the DEAP database with a 10-fold cross-validation (CV) scheme. Similarly, accuracy values of 98.56%, 98.82%, and 98.99% are obtained for the LV vs. HV, LA vs. HA, and LD vs. HD classification schemes, respectively, using the deep CNN and the θ-rhythm on the DREAMER database. The proposed multi-channel, rhythm-specific deep CNN classification model obtained an average accuracy of 57.14% using the α-rhythm and trial-specific CV on the DASPS database. Moreover, for the 8-quadrant-based emotion classification strategy, the deep CNN-based classifier obtained an overall accuracy of 24.37% using the γ-rhythms of multi-channel EEG signals. Our deep CNN model can be used for real-time automated emotion recognition applications.
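As a rough illustration of the pipeline summarized above, the sketch below band-pass filters a multi-channel EEG epoch into one of the five rhythms and builds a 1-D CNN whose layer counts match those stated in the abstract (eight convolution, three average-pooling, four batch-normalization, three spatial-dropout, two dropout, one global average-pooling, and three dense layers). This is not the authors' released code: the sampling rate, filter order, band edges, filter counts, kernel sizes, dropout rates, and layer ordering are assumptions chosen only for illustration.

```python
# Minimal sketch of the rhythm-specific CNN pipeline; hyper-parameters are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from tensorflow.keras import layers, models

FS = 128  # assumed sampling rate in Hz (e.g., DEAP's down-sampled rate)
RHYTHMS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
           "beta": (13, 30), "gamma": (30, 45)}

def extract_rhythm(eeg, band, fs=FS, order=4):
    """Band-pass filter a (channels, samples) EEG epoch into one rhythm."""
    low, high = RHYTHMS[band]
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def build_cnn(n_channels, n_samples, n_classes=2):
    """CNN with the layer counts from the abstract; sizes and ordering are assumed."""
    m = models.Sequential([
        layers.Input(shape=(n_samples, n_channels)),
        # block 1: 3 conv + batch-norm + average pooling + spatial dropout
        layers.Conv1D(32, 7, padding="same", activation="relu"),
        layers.Conv1D(32, 7, padding="same", activation="relu"),
        layers.Conv1D(32, 7, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.AveragePooling1D(2),
        layers.SpatialDropout1D(0.2),
        # block 2: 3 conv + batch-norm + average pooling + spatial dropout
        layers.Conv1D(64, 5, padding="same", activation="relu"),
        layers.Conv1D(64, 5, padding="same", activation="relu"),
        layers.Conv1D(64, 5, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.AveragePooling1D(2),
        layers.SpatialDropout1D(0.2),
        # block 3: 2 conv + 2 batch-norm + average pooling + spatial dropout
        layers.Conv1D(128, 3, padding="same", activation="relu"),
        layers.Conv1D(128, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.AveragePooling1D(2),
        layers.SpatialDropout1D(0.2),
        layers.BatchNormalization(),
        layers.GlobalAveragePooling1D(),
        # classifier head: 3 dense layers with 2 dropouts
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(n_classes, activation="softmax"),
    ])
    m.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
    return m

# Example: one beta-rhythm epoch from a 32-channel, 4-second window
epoch = np.random.randn(32, 4 * FS)
beta = extract_rhythm(epoch, "beta")
model = build_cnn(n_channels=32, n_samples=4 * FS)
# model.fit(...) would then be run per rhythm and per binary task (LV/HV, LA/HA, LD/HD)
```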