Department of Electronics & Communication, Faculty of Engineering, Misr International University (MIU), Heliopolis, Cairo P.O. Box 1, Egypt.
Sensors (Basel). 2023 Jan 21;23(3):1255. doi: 10.3390/s23031255.
Emotion artificial intelligence (AI) is increasingly being adopted in several industries, such as healthcare and education. Facial expressions and tone of speech have previously been considered for emotion recognition, yet they have the drawback of being easily manipulated by subjects to mask their true emotions. Electroencephalography (EEG) has emerged as a reliable and cost-effective method for detecting true human emotions. Recently, substantial research effort has been devoted to developing efficient wearable EEG devices for consumer use in out-of-the-lab scenarios. In this work, a subject-dependent emotional valence recognition method intended for emotion AI applications is implemented. Time- and frequency-domain features were computed from a single time series derived from the Fp1 and Fp2 channels. Several analyses were performed on the strongest valence emotions to determine the most relevant features, frequency bands, and EEG timeslots using the benchmark DEAP dataset. Binary classification experiments achieved an accuracy of 97.42% using the alpha band, thereby outperforming several approaches from the literature by approximately 3-22%. Multiclass classification achieved an accuracy of 95.0%. Feature computation and classification required less than 0.1 s. The proposed method thus has the advantage of reduced computational complexity since, unlike most methods in the literature, only two EEG channels were considered. In addition, the minimal feature set derived from the thorough analyses conducted in this study was sufficient to achieve state-of-the-art performance. The implemented EEG emotion recognition method therefore has the merits of being reliable and easily reproducible, making it well-suited for wearable EEG devices.
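The pipeline summarized above — deriving a single time series from the Fp1 and Fp2 channels, isolating the alpha band, and computing simple time- and frequency-domain features — can be sketched as follows. This is an illustrative reconstruction only, not the authors' exact implementation: the channel averaging, the specific features (mean absolute amplitude, standard deviation, alpha-band power), and the filter order are assumptions for demonstration; the sampling rate of 128 Hz matches DEAP's preprocessed data.

```python
# Hedged sketch of a two-channel alpha-band feature extractor, assuming:
# - the single time series is the mean of Fp1 and Fp2 (assumption),
# - alpha band = 8-13 Hz,
# - fs = 128 Hz (DEAP's preprocessed sampling rate).
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 128  # Hz

def alpha_features(fp1, fp2, fs=FS):
    """Compute example time/frequency features from two frontal channels."""
    x = (fp1 + fp2) / 2.0  # single derived time series (assumed combination)
    # 4th-order Butterworth band-pass for the alpha band (8-13 Hz)
    b, a = butter(4, [8.0 / (fs / 2), 13.0 / (fs / 2)], btype="band")
    alpha = filtfilt(b, a, x)
    # Time-domain features on the alpha-band signal
    mean_abs = np.mean(np.abs(alpha))
    std = np.std(alpha)
    # Frequency-domain feature: alpha-band power via Welch's method
    f, pxx = welch(x, fs=fs, nperseg=fs * 2)
    band = (f >= 8) & (f <= 13)
    alpha_power = pxx[band].sum() * (f[1] - f[0])  # integrate PSD over band
    return np.array([mean_abs, std, alpha_power])

# Synthetic usage example: 10 s of noise plus a 10 Hz (alpha) component
rng = np.random.default_rng(0)
t = np.arange(10 * FS) / FS
fp1 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
fp2 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
feats = alpha_features(fp1, fp2)
```

The resulting feature vector would then feed a standard binary or multiclass classifier; with only two channels and a handful of features, both extraction and classification are cheap, consistent with the sub-0.1 s runtime reported above.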