Biomedical Signal Interpretation and Computational Simulation (BSICoS), Aragón Institute for Engineering Research (I3A), IIS Aragón, University of Zaragoza, María de Luna 1, 50015 Zaragoza, Spain. Department ESAII, Centre for Biomedical Engineering Research, Universitat Politècnica de Catalunya, 08028 Barcelona, Spain. Aragón Institute for Engineering Research (I3A), IIS Aragón, University of Zaragoza, 50018 Zaragoza, Spain. Author to whom any correspondence should be addressed.
Physiol Meas. 2019 Sep 3;40(8):084001. doi: 10.1088/1361-6579/ab310a.
Interest in emotion recognition has increased in recent years, as it offers a useful tool for diagnosing psychoneurological disorders. In this study, the auto-mutual information function (AMIF) and the cross-mutual information function (CMIF) are used for human emotion recognition.
The AMIF technique was applied to heart rate variability (HRV) signals to study complex interdependencies, and the CMIF technique was used to quantify the complex coupling between HRV and respiratory signals. Both algorithms were adapted to short-term RR time series. Traditional band-pass filtering was applied to the RR series in the low-frequency (LF) and high-frequency (HF) bands, and a respiration-based filter bandwidth ([Formula: see text]) was also investigated. Both the AMIF and the CMIF were computed over different time scales as specific complexity measures. The ability of the parameters derived from the AMIF and the CMIF to discriminate emotions was evaluated on a database of video-induced emotion elicitation. Five elicited states were considered: relax (neutral), joy (positive valence), and fear, sadness and anger (negative valence).
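As a rough illustration of the information-theoretic measures named above, the following is a minimal sketch of histogram-based AMIF and CMIF estimators evaluated over a range of lags. The bin count, lag range, and plug-in histogram estimator are illustrative assumptions for this sketch, not the paper's exact implementation (which also involves band-pass pre-filtering of the RR series).

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in (histogram-based) mutual information estimate, in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                     # joint probability estimate
    px = pxy.sum(axis=1, keepdims=True)           # marginal of x
    py = pxy.sum(axis=0, keepdims=True)           # marginal of y
    nz = pxy > 0                                  # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def amif(series, max_lag=20, bins=8):
    """Auto-mutual information function: MI between x(t) and x(t + tau)."""
    return np.array([mutual_information(series[:-tau], series[tau:], bins)
                     for tau in range(1, max_lag + 1)])

def cmif(x, y, max_lag=20, bins=8):
    """Cross-mutual information function: MI between x(t) and y(t + tau)."""
    return np.array([mutual_information(x[:-tau], y[tau:], bins)
                     for tau in range(1, max_lag + 1)])
```

For a strongly autocorrelated signal (e.g. a slow oscillation, as in a respiration-modulated RR series), the AMIF decays slowly with lag, whereas for white noise it drops quickly; summaries of that decay across time scales are the kind of complexity parameter the study derives.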
The results revealed that the AMIF applied to the RR time series filtered in the [Formula: see text] band was able to discriminate between the following pairs: relax versus joy and fear; joy versus each negative-valence condition; and fear versus sadness and anger. All comparisons reached a statistical significance level of p-value [Formula: see text] 0.05, with sensitivity, specificity and accuracy higher than 70% and an area under the receiver operating characteristic curve AUC [Formula: see text] 0.70. Furthermore, the parameters derived from the AMIF and the CMIF characterized the low signal complexity observed during fear relative to every other elicited state studied.
Based on these results, human emotion, as manifested in the HRV and respiratory signal responses, can be characterized by means of information-content complexity.