Wroclaw University of Science and Technology, Faculty of Information and Communication Technology, Department of Artificial Intelligence, Wrocław, 50-370, Poland.
Adam Mickiewicz University, Faculty of Psychology and Cognitive Science, Poznan, 61-664, Poland.
Sci Data. 2022 Apr 7;9(1):158. doi: 10.1038/s41597-022-01262-0.
The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness. Three wearables recorded physiological data (EEG, BVP (2x), HR, EDA, SKT, ACC (3x), and GYRO (2x)) in parallel with upper-body videos. After each film clip, participants completed two types of self-reports: (1) ratings of the nine discrete emotions and (2) ratings of three affective dimensions: valence, arousal, and motivation. The collected data support various ER approaches, e.g., multimodal ER, EEG- vs. cardiovascular-based ER, and transitions from discrete to dimensional emotion representations. The technical validation indicated that watching the film clips elicited the targeted emotions and confirmed the high quality of the recorded signals.