Tan Jun-Wen, Andrade Adriano O, Li Hang, Walter Steffen, Hrabal David, Rukavina Stefanie, Limbrecht-Ecklundt Kerstin, Hoffman Holger, Traue Harald C
College of Teacher Education, Lishui University, Lishui, P.R. China.
Biomedical Engineering Laboratory, Faculty of Electrical Engineering, Federal University of Uberlândia, Uberlândia, Brazil.
PLoS One. 2016 Jan 13;11(1):e0146691. doi: 10.1371/journal.pone.0146691. eCollection 2016.
BACKGROUND: Research suggests that interaction between humans and digital environments constitutes a form of companionship beyond mere technical convenience. To this end, attempts have been made to design computer systems able to demonstrably empathize with the human affective experience. Facial electromyography (EMG) is one technique that enables machines to access human affective states. Numerous studies have investigated the effects of emotional valence on facial EMG activity recorded over the corrugator supercilii (frowning muscle) and zygomaticus major (smiling muscle); arousal, however, has received comparatively little research attention. In the present study, we sought to identify intensive valence and arousal affective states from facial EMG activity. METHODS: Ten blocks of affective pictures were divided into five categories: neutral valence/low arousal (0VLA), positive valence/high arousal (PVHA), negative valence/high arousal (NVHA), positive valence/low arousal (PVLA), and negative valence/low arousal (NVLA), and the ability of each to elicit the corresponding valence and arousal affective states was investigated in detail. One hundred and thirteen participants were exposed to these stimuli while facial EMG was recorded. A set of 16 features based on the amplitude, frequency, predictability, and variability of the signals was defined and classified using a support vector machine (SVM). RESULTS: We observed highly accurate classification rates based on the combined corrugator and zygomaticus EMG, ranging from 75.69% to 100.00% across the baseline and five affective states (0VLA, PVHA, PVLA, NVHA, and NVLA) in all individuals. Classification accuracy differed significantly between senior and young adults, but not between female and male participants. CONCLUSION: Our research provides robust evidence for the recognition of intensive valence and arousal affective states in young and senior adults. These findings contribute to the future application of facial EMG for identifying user affective states in human-machine interaction (HMI) or companion robotic systems (CRS).
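The pipeline described in METHODS (per-epoch EMG features fed to an SVM) can be sketched as follows. This is an illustrative sketch only: the paper defines 16 features over corrugator and zygomaticus signals, whereas the toy features, the synthetic two-class data, and the RBF-kernel choice below are stand-in assumptions, not the authors' exact method.

```python
# Hedged sketch: simplified EMG feature extraction + SVM classification.
# Feature formulas and data here are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

def emg_features(signal):
    """Toy amplitude/frequency/variability descriptors of one EMG epoch."""
    rms = np.sqrt(np.mean(signal ** 2))                  # amplitude
    mav = np.mean(np.abs(signal))                        # amplitude
    zcr = np.mean(np.abs(np.diff(np.sign(signal)))) / 2  # frequency proxy
    var = np.var(signal)                                 # variability
    return np.array([rms, mav, zcr, var])

# Synthetic two-class data: "high-activation" epochs get larger amplitude.
def make_epoch(gain, n_samples=1000):
    return gain * rng.standard_normal(n_samples)

X = np.array([emg_features(make_epoch(g))
              for g in ([1.0] * 60 + [2.5] * 60)])
y = np.array([0] * 60 + [1] * 60)  # 0 = low, 1 = high activation

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
mean_acc = scores.mean()
print(f"5-fold cross-validated accuracy: {mean_acc:.2f}")
```

Standardizing features before the SVM matters in practice because amplitude- and variance-based EMG features can differ by orders of magnitude; the multi-class case in the paper (baseline plus five affective states) would use the same pipeline with six labels.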