Fujimura Tomomi, Umemura Hiroyuki
Human Informatics Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Japan.
Cogn Emot. 2018 Dec;32(8):1663-1670. doi: 10.1080/02699931.2017.1419936. Epub 2018 Jan 15.
The present study describes the development and validation of a facial expression database comprising five different horizontal face angles in dynamic and static presentations. The database includes twelve expression types portrayed by eight Japanese models. This database was inspired by the dimensional and categorical models of emotion: surprise, fear, sadness, anger with open mouth, anger with closed mouth, disgust with open mouth, disgust with closed mouth, excitement, happiness, relaxation, sleepiness, and neutral (static only). The expressions were validated using emotion classification and Affect Grid rating tasks [Russell, Weiss, & Mendelsohn, 1989. Affect Grid: A single-item scale of pleasure and arousal. Journal of Personality and Social Psychology, 57(3), 493-502]. The results indicate that most of the expressions were recognised as the intended emotions and could systematically represent affective valence and arousal. Furthermore, face angle and facial motion information influenced emotion classification as well as valence and arousal ratings. Our database will be available online at the following URL: https://www.dh.aist.go.jp/database/face2017/