School of Computer Science and Technology, Anhui University, Hefei 230601, China.
Institute of Physical Science and Information Technology, Anhui University, Hefei 230601, China.
Sensors (Basel). 2018 Aug 27;18(9):2826. doi: 10.3390/s18092826.
Attending to adolescents and detecting their emotional state is vital for promoting rehabilitation therapy within an E-Healthcare system. Focusing on a novel approach for a sensor-based E-Healthcare system, we propose an eye movement information-based emotion perception algorithm that synchronously collects and analyzes electrooculography (EOG) signals and eye movement video. Specifically, we first extract time-frequency eye movement features by applying the short-time Fourier transform (STFT) to the raw multi-channel EOG signals. Subsequently, to integrate the time-domain eye movement features (i.e., saccade duration, fixation duration, and pupil diameter), we investigate two fusion strategies: feature-level fusion (FLF) and decision-level fusion (DLF). Recognition experiments were performed on three emotional states: positive, neutral, and negative. The average accuracies are 88.64% (the FLF method) and 88.35% (the DLF method with the maximal rule), respectively. The experimental results reveal that eye movement information can effectively reflect the emotional state of adolescents, providing a promising tool to improve the performance of the E-Healthcare system.
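To make the two fusion strategies concrete, the following is a minimal sketch, not the authors' implementation: the STFT feature extraction follows the abstract, while the sampling rate, STFT window length, data shapes, and the SVM classifier are assumptions introduced here for illustration.

```python
# Minimal sketch of FLF vs. DLF (maximal rule) for EOG-based emotion recognition.
# The classifier (SVM), sampling rate, and feature summaries are illustrative
# assumptions; only the overall pipeline follows the abstract.
import numpy as np
from scipy.signal import stft
from sklearn.svm import SVC

FS = 250  # assumed EOG sampling rate in Hz (not stated in the abstract)

def stft_features(eog_trial, fs=FS, nperseg=128):
    """Time-frequency features: mean log-power per STFT bin, per EOG channel.
    eog_trial has shape (n_channels, n_samples)."""
    feats = []
    for ch in eog_trial:
        _, _, Z = stft(ch, fs=fs, nperseg=nperseg)
        feats.append(np.log(np.abs(Z) ** 2 + 1e-12).mean(axis=1))
    return np.concatenate(feats)

def time_domain_features(saccade_dur, fixation_dur, pupil_diam):
    """Time-domain eye movement features named in the abstract."""
    return np.array([saccade_dur, fixation_dur, pupil_diam], dtype=float)

# Feature-level fusion (FLF): concatenate both feature sets, train one classifier.
def flf_predict(clf, tf_feat, td_feat):
    return clf.predict(np.concatenate([tf_feat, td_feat])[None, :])[0]

# Decision-level fusion (DLF, maximal rule): train one classifier per feature set,
# then pick the class with the highest posterior across the two classifiers.
def dlf_max_predict(clf_tf, clf_td, tf_feat, td_feat):
    p_tf = clf_tf.predict_proba(tf_feat[None, :])[0]
    p_td = clf_td.predict_proba(td_feat[None, :])[0]
    return int(np.argmax(np.maximum(p_tf, p_td)))  # index over {negative, neutral, positive}

# Example setup: classifiers with probability outputs, e.g. SVC(probability=True),
# would be fitted on training trials before calling the predict helpers above.
```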