

The uulmMAC Database-A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction.

Affiliations

Section Medical Psychology, University of Ulm, Frauensteige 6, 89075 Ulm, Germany.

Institute of Neural Information Processing, University of Ulm, James-Frank-Ring, 89081 Ulm, Germany.

Publication Information

Sensors (Basel). 2020 Apr 17;20(8):2308. doi: 10.3390/s20082308.

DOI: 10.3390/s20082308
PMID: 32316626
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7219061/
Abstract

In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based HCI relevant emotional and cognitive load states. It consists of six experimental sequences, inducing , and . Each sequence is followed by subjective feedbacks to validate the induction, a respiration baseline to level off the physiological reactions, and a summary of results. Further, prior to the experiment, three questionnaires related to emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI) were collected from each subject to evaluate the stability of the induction paradigm. Based on this HCI scenario, the uulmMAC database, consisting of two homogenous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities including 4 × video, 3 × audio, and 7 × biophysiological, depth, and pose streams. Further, additional labels and annotations were also collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedbacks shows significant differences between the sequences, well consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our database is a valuable contribution for the field of affective computing and multimodal data analysis: Acquired in a mobile interactive scenario close to real HCI, it consists of a large number of subjects and allows transtemporal investigations. Validated via subjective feedbacks and checked for quality issues, it can be used for affective computing and machine learning applications.
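The abstract reports that the subjective feedback collected after each sequence shows significant differences between the induction sequences. As a rough illustration of how such matched per-subject ratings can be compared, here is a minimal paired t-statistic sketch in Python. The ratings, scale, and sequence roles below are invented for illustration; the actual uulmMAC feedback items, scales, and statistical tests are those described in the paper, not this sketch.

```python
import math
import statistics

# Hypothetical post-sequence self-report ratings (one value per subject),
# matched across two induction sequences. These numbers are NOT from the
# uulmMAC data; they only illustrate the shape of a paired comparison.
seq_high_load = [4.2, 3.9, 4.5, 4.0, 3.8, 4.1]  # e.g. an overload sequence
seq_low_load = [2.1, 2.4, 1.9, 2.6, 2.2, 2.0]   # e.g. an easy sequence

def paired_t(x, y):
    """Paired t statistic: mean of per-subject differences divided by
    the standard error of those differences."""
    diffs = [a - b for a, b in zip(x, y)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation (n - 1)
    return mean_d / (sd_d / math.sqrt(len(diffs)))

t = paired_t(seq_high_load, seq_low_load)
print(f"paired t over {len(seq_high_load)} subjects: {t:.2f}")
```

A large positive t here would indicate that subjects consistently rated the first sequence higher than the second, which is the kind of sequence-wise difference the validation in the paper checks for (the paper's own analysis may use different tests and corrections).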


Figures 1-15 and Appendix Figures A1-A2 are available in the full text at PMC: https://pmc.ncbi.nlm.nih.gov/articles/PMC7219061/

Similar Articles

1. The uulmMAC Database-A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction. Sensors (Basel). 2020 Apr 17;20(8):2308. doi: 10.3390/s20082308.
2. Computer-aided psychotherapy based on multimodal elicitation, estimation and regulation of emotion. Psychiatr Danub. 2013 Sep;25(3):340-6.
3. A Hybrid Multimodal Emotion Recognition Framework for UX Evaluation Using Generalized Mixture Functions. Sensors (Basel). 2023 Apr 28;23(9):4373. doi: 10.3390/s23094373.
4. Affective Computing and the Impact of Gender and Age. PLoS One. 2016 Mar 3;11(3):e0150584. doi: 10.1371/journal.pone.0150584. eCollection 2016.
5. A Multimodal Dataset for Mixed Emotion Recognition. Sci Data. 2024 Aug 5;11(1):847. doi: 10.1038/s41597-024-03676-4.
6. [Research on the performance comparing and building of affective computing database based on physiological parameters]. Sheng Wu Yi Xue Gong Cheng Xue Za Zhi. 2014 Aug;31(4):782-7.
7. User frustration in HIT interfaces: exploring past HCI research for a better understanding of clinicians' experiences. AMIA Annu Symp Proc. 2015 Nov 5;2015:1008-17. eCollection 2015.
8. K-EmoPhone: A Mobile and Wearable Dataset with In-Situ Emotion, Stress, and Attention Labels. Sci Data. 2023 Jun 2;10(1):351. doi: 10.1038/s41597-023-02248-2.
9. The Impacts of Attitudes and Engagement on Electronic Word of Mouth (eWOM) of Mobile Sensor Computing Applications. Sensors (Basel). 2016 Mar 18;16(3):391. doi: 10.3390/s16030391.
10. A Review of Emotion Recognition Methods Based on Data Acquired via Smartphone Sensors. Sensors (Basel). 2020 Nov 8;20(21):6367. doi: 10.3390/s20216367.

Cited By

1. : A Multimodal Dataset for Cognitive Load Estimation. Sensors (Basel). 2022 Dec 28;23(1):340. doi: 10.3390/s23010340.
2. Emotion and Stress Recognition Related Sensors and Machine Learning Technologies. Sensors (Basel). 2021 Mar 24;21(7):2273. doi: 10.3390/s21072273.
3. Multi-Path and Group-Loss-Based Network for Speech Emotion Recognition in Multi-Domain Datasets.

References

1. Estimating cognitive load during self-regulation of brain activity and neurofeedback with therapeutic brain-computer interfaces. Front Behav Neurosci. 2015 Feb 16;9:21. doi: 10.3389/fnbeh.2015.00021. eCollection 2015.
2. The view from the road: the contribution of on-road glance-monitoring technologies to understanding driver behavior. Accid Anal Prev. 2013 Sep;58:175-86. doi: 10.1016/j.aap.2013.02.008. Epub 2013 Feb 27.
3. A computerized multidimensional measurement of mental workload via handwriting analysis. Behav Res Methods. 2012 Jun;44(2):575-86. doi: 10.3758/s13428-011-0159-8.
4. Arousal Detection in Elderly People from Electrodermal Activity Using Musical Stimuli. Sensors (Basel). 2020 Aug 25;20(17):4788. doi: 10.3390/s20174788.
5. Development and evaluation of an ambulatory stress monitor based on wearable sensors. IEEE Trans Inf Technol Biomed. 2012 Mar;16(2):279-86. doi: 10.1109/TITB.2011.2169804. Epub 2011 Sep 29.
6. Effects of visual and verbal presentation on cognitive load in vigilance, memory, and arithmetic tasks. Psychophysiology. 2011 Mar;48(3):323-32. doi: 10.1111/j.1469-8986.2010.01069.x. Epub 2010 Aug 16.
7. A psychometric analysis of the Trait Emotional Intelligence Questionnaire-Short Form (TEIQue-SF) using item response theory. J Pers Assess. 2010 Sep;92(5):449-57. doi: 10.1080/00223891.2010.497426.
8. Tuning down the emotional brain: an fMRI study of the effects of cognitive load on the processing of affective images. Neuroimage. 2009 May 1;45(4):1212-9. doi: 10.1016/j.neuroimage.2009.01.016. Epub 2009 Jan 24.
9. Individual differences in two emotion regulation processes: implications for affect, relationships, and well-being. J Pers Soc Psychol. 2003 Aug;85(2):348-62. doi: 10.1037/0022-3514.85.2.348.
10. The 'Trier Social Stress Test'--a tool for investigating psychobiological stress responses in a laboratory setting. Neuropsychobiology. 1993;28(1-2):76-81. doi: 10.1159/000119004.