Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies.

Authors

Lammers Sebastian, Bente Gary, Tepest Ralf, Jording Mathis, Roth Daniel, Vogeley Kai

Affiliations

Department of Psychiatry, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany.

Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Research Center Jülich, Jülich, Germany.

Publication info

Front Robot AI. 2019 Sep 27;6:94. doi: 10.3389/frobt.2019.00094. eCollection 2019.

DOI: 10.3389/frobt.2019.00094
PMID: 33501109
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7805965/
Abstract

Others' movements inform us about their current activities as well as their intentions and emotions. Research on the distinct mechanisms underlying action recognition and emotion inferences has been limited by a lack of suitable comparative stimulus material. Problematic confounds can derive from low-level physical features (e.g., luminance) as well as from higher-level psychological features (e.g., stimulus difficulty). Here we present a standardized stimulus dataset that allows both action and emotion recognition to be addressed with identical stimuli. The stimulus set consists of 792 computer animations of a neutral avatar based on full-body motion capture protocols. Motion capture was performed on 22 human volunteers, instructed to perform six everyday activities (mopping, sweeping, painting with a roller, painting with a brush, wiping, sanding) in three different moods (angry, happy, sad). Five-second clips of each motion protocol were rendered into AVI files using two virtual camera perspectives per clip. In contrast to video stimuli, the computer animations made it possible to standardize the physical appearance of the avatar and to control lighting and coloring conditions, thus reducing the stimulus variation to mere movement. To control for low-level optical features of the stimuli, we developed and applied a set of MATLAB routines extracting basic physical features, including average background-foreground proportion and frame-by-frame pixel change dynamics. This information was used to identify outliers and to homogenize the stimuli across action and emotion categories. This led to a smaller stimulus subset (n = 83 animations within the 792-clip database) containing only two actions (mopping, sweeping) and two moods (angry, happy).
To further homogenize this stimulus subset with regard to psychological criteria, we conducted an online observer study (n = 112 participants) to assess recognition rates for actions and moods, which led to a final sub-selection of 32 clips (eight per category) within the database. The ACASS database and its subsets provide unique opportunities for research applications in social psychology, social neuroscience, and applied clinical studies on communication disorders. All 792 AVI files, selected subsets, MATLAB code, annotations, and motion capture data (FBX files) are available online.
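The paper's low-level control measures were implemented as MATLAB routines (available with the dataset). As an illustration only, the two features named in the abstract — average background-foreground proportion and frame-by-frame pixel change dynamics — can be sketched in Python along these lines; the function names, the grayscale-frame representation, and the dark-background threshold are assumptions for this sketch, not details from the paper:

```python
import numpy as np

def foreground_proportion(frames, bg_threshold=10):
    """Fraction of pixels brighter than a dark-background threshold,
    averaged over all frames of the clip."""
    # frames: (n_frames, height, width) grayscale array
    return (frames.astype(np.int32) > bg_threshold).mean()

def pixel_change_dynamics(frames):
    """Mean absolute pixel difference for each frame-to-frame transition."""
    diffs = np.abs(np.diff(frames.astype(np.int32), axis=0))
    return diffs.mean(axis=(1, 2))  # one value per transition

# Tiny synthetic clip: 3 frames of 4x4 pixels, a bright square moving right
frames = np.zeros((3, 4, 4), dtype=np.uint8)
frames[0, 0:2, 0:2] = 200  # frame 0: square at the left
frames[1, 0:2, 1:3] = 200  # frame 1: shifted one pixel right
frames[2, 0:2, 2:4] = 200  # frame 2: shifted again

print(foreground_proportion(frames))   # 4 bright pixels of 16 -> 0.25
print(pixel_change_dynamics(frames))   # [50. 50.]
```

Statistics like these let outlier clips be flagged and stimulus categories be matched on purely optical grounds before any psychological validation.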


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/ec0fe5064fa3/frobt-06-00094-g0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/3b9c22866a6a/frobt-06-00094-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/af3800ea1b29/frobt-06-00094-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/eee41a43ddcb/frobt-06-00094-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/9438c971d06a/frobt-06-00094-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/366f0bf2cdea/frobt-06-00094-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/6178af0099c4/frobt-06-00094-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/8676d443bdf8/frobt-06-00094-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/078d69778e4e/frobt-06-00094-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/57ba/7805965/bcb9ce01fab5/frobt-06-00094-g0009.jpg

Similar articles

1
Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies.
Front Robot AI. 2019 Sep 27;6:94. doi: 10.3389/frobt.2019.00094. eCollection 2019.
2
Performance-driven facial animation: basic research on human judgments of emotional state in facial avatars.
Cyberpsychol Behav. 2001 Aug;4(4):471-87. doi: 10.1089/109493101750527033.
3
A motion capture library for the study of identity, gender, and emotion perception from biological motion.
Behav Res Methods. 2006 Feb;38(1):134-41. doi: 10.3758/bf03192758.
4
A dyadic stimulus set of audiovisual affective displays for the study of multisensory, emotional, social interactions.
Behav Res Methods. 2016 Dec;48(4):1285-1295. doi: 10.3758/s13428-015-0654-4.
5
Controlling Video Stimuli in Sign Language and Gesture Research: The Package for Analyzing Motion-Tracking Data in .
Front Psychol. 2021 Feb 19;12:628728. doi: 10.3389/fpsyg.2021.628728. eCollection 2021.
6
Beyond Stereotypes: Analyzing Gender and Cultural Differences in Nonverbal Rapport.
Front Psychol. 2020 Dec 11;11:599703. doi: 10.3389/fpsyg.2020.599703. eCollection 2020.
7
Film clips and narrative text as subjective emotion elicitation techniques.
J Soc Psychol. 2017;157(2):194-210. doi: 10.1080/00224545.2016.1208138. Epub 2016 Jul 6.
8
Social Perception and Interaction Database-A Novel Tool to Study Social Cognitive Processes With Point-Light Displays.
Front Psychiatry. 2020 Mar 11;11:123. doi: 10.3389/fpsyt.2020.00123. eCollection 2020.
9
The MPI emotional body expressions database for narrative scenarios.
PLoS One. 2014 Dec 2;9(12):e113647. doi: 10.1371/journal.pone.0113647. eCollection 2014.
10
How Do We Recognize Emotion From Movement? Specific Motor Components Contribute to the Recognition of Each Emotion.
Front Psychol. 2019 Jul 3;10:1389. doi: 10.3389/fpsyg.2019.01389. eCollection 2019.

Cited by

1
Integrating media content analysis, reception analysis, and media effects studies.
Front Neurosci. 2023 Apr 27;17:1155750. doi: 10.3389/fnins.2023.1155750. eCollection 2023.

References

1
Distinct functional roles of the mirror neuron system and the mentalizing system.
Neuroimage. 2019 Nov 15;202:116102. doi: 10.1016/j.neuroimage.2019.116102. Epub 2019 Aug 22.
2
Two social brains: neural mechanisms of intersubjectivity.
Philos Trans R Soc Lond B Biol Sci. 2017 Aug 19;372(1727). doi: 10.1098/rstb.2016.0245.
3
Decoding intentions from movement kinematics.
Sci Rep. 2016 Nov 15;6:37036. doi: 10.1038/srep37036.
4
Interpersonal predictive coding, not action perception, is impaired in autism.
Philos Trans R Soc Lond B Biol Sci. 2016 May 5;371(1693). doi: 10.1098/rstb.2015.0373.
5
Is it the real deal? Perception of virtual characters versus humans: an affective cognitive neuroscience perspective.
Front Psychol. 2015 May 12;6:576. doi: 10.3389/fpsyg.2015.00576. eCollection 2015.
6
Engaged listeners: shared neural processing of powerful political speeches.
Soc Cogn Affect Neurosci. 2015 Aug;10(8):1137-43. doi: 10.1093/scan/nsu168. Epub 2015 Feb 3.
7
Atypical cross talk between mentalizing and mirror neuron networks in autism spectrum disorder.
JAMA Psychiatry. 2014 Jul 1;71(7):751-60. doi: 10.1001/jamapsychiatry.2014.83.
8
AMAB: automated measurement and analysis of body motion.
Behav Res Methods. 2014 Sep;46(3):625-33. doi: 10.3758/s13428-013-0398-y.
9
Expression of emotion in the kinematics of locomotion.
Exp Brain Res. 2013 Mar;225(2):159-76. doi: 10.1007/s00221-012-3357-4. Epub 2012 Dec 19.
10
Body cues, not facial expressions, discriminate between intense positive and negative emotions.
Science. 2012 Nov 30;338(6111):1225-9. doi: 10.1126/science.1224313.