The Chinese customers and service staff interactive affective system (CCSIAS): introduction to a multimodal stimulus dataset

Authors

Liu Ping, Zhang Yi, Xiong Ziyue, Gao Ying

Affiliation

School of Business, Sichuan University, Chengdu, China.

Publication

Front Psychol. 2024 May 3;15:1302253. doi: 10.3389/fpsyg.2024.1302253. eCollection 2024.

DOI: 10.3389/fpsyg.2024.1302253
PMID: 38765835
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11099906/
Abstract

To study the emotional interaction between customers and service staff, researchers typically use single-modal stimuli to activate subjects' emotions, while more efficient multimodal emotion stimuli are often neglected. This study constructs a multimodal emotion stimulus database (CCSIAS) from video recordings of the real work status of 29 service staff and audio clips of interactions between customers and service staff, collected by installing wide-angle cameras and searching the company's Ocean Engine for 15 consecutive days. First, Study 1 developed a tool to assess the emotional states of customers and service staff. Second, in Study 2, 40 Master's and PhD students were invited to rate the audio and video data with the tool developed in Study 1, evaluating the emotional states of customers and service staff. Third, 118 participants were recruited to test the results of Study 2 and confirm the stability of the derived data. The results yielded 139 sets of stable emotional audio and video data (26 high, 59 medium, and 54 low intensity). The amount of emotional information matters for effectively activating participants' emotional states, and the degree of emotional activation for video data is significantly higher than for audio data. Overall, the findings show that studying emotional interaction phenomena requires a multimodal dataset. The CCSIAS (https://osf.io/muc86/) can extend the depth and breadth of emotional interaction research and can be applied to activate different emotional states between customers and service staff in the fields of organizational behavior and psychology.


Figures (fpsyg-15-1302253, g001–g009):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/141dda475361/fpsyg-15-1302253-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/f1f23dfe87fd/fpsyg-15-1302253-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/6b1a805f6d9e/fpsyg-15-1302253-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/713ba43f5798/fpsyg-15-1302253-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/b24414c158f0/fpsyg-15-1302253-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/c2590518aff9/fpsyg-15-1302253-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/443a9288f6d7/fpsyg-15-1302253-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/2d9254eab6c1/fpsyg-15-1302253-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2547/11099906/34033790c301/fpsyg-15-1302253-g009.jpg

Similar Articles

1. The Chinese customers and service staff interactive affective system (CCSIAS): introduction to a multimodal stimulus dataset.
   Front Psychol. 2024 May 3;15:1302253. doi: 10.3389/fpsyg.2024.1302253. eCollection 2024.
2. Judging the emotional states of customer service staff in the workplace: A multimodal dataset analysis.
   Front Psychol. 2022 Nov 11;13:1001885. doi: 10.3389/fpsyg.2022.1001885. eCollection 2022.
3. The uulmMAC Database: A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction.
   Sensors (Basel). 2020 Apr 17;20(8):2308. doi: 10.3390/s20082308.
4. Influence of Multimodal Emotional Stimulations on Brain Activity: An Electroencephalographic Study.
   Sensors (Basel). 2023 May 16;23(10):4801. doi: 10.3390/s23104801.
5. CREMA-D: Crowd-sourced Emotional Multimodal Actors Dataset.
   IEEE Trans Affect Comput. 2014 Oct-Dec;5(4):377-390. doi: 10.1109/TAFFC.2014.2336244.
6. Automatic detection of service initiation signals used in bars.
   Front Psychol. 2013 Aug 30;4:557. doi: 10.3389/fpsyg.2013.00557. eCollection 2013.
7. Breaking the trade-off between efficiency and service.
   Harv Bus Rev. 2006 Nov;84(11):93-101, 156.
8. Method for analyzing sequential services using EEG: Micro-meso analysis of emotional changes in real flight service.
   Physiol Behav. 2023 Dec 1;272:114359. doi: 10.1016/j.physbeh.2023.114359. Epub 2023 Sep 26.
9. Comparison of response to Chinese and Western videos of mental-health-related emotions in a representative Chinese sample.
   PeerJ. 2021 Jan 19;9:e10440. doi: 10.7717/peerj.10440. eCollection 2021.
10. A Review on Automatic Facial Expression Recognition Systems Assisted by Multimodal Sensor Data.
   Sensors (Basel). 2019 Apr 18;19(8):1863. doi: 10.3390/s19081863.

References Cited in This Article

1. Judging the emotional states of customer service staff in the workplace: A multimodal dataset analysis.
   Front Psychol. 2022 Nov 11;13:1001885. doi: 10.3389/fpsyg.2022.1001885. eCollection 2022.
2. The taste & affect music database: Subjective rating norms for a new set of musical stimuli.
   Behav Res Methods. 2023 Apr;55(3):1121-1140. doi: 10.3758/s13428-022-01862-z. Epub 2022 May 17.
3. The Social Effects of Emotions.
   Annu Rev Psychol. 2022 Jan 4;73:629-658. doi: 10.1146/annurev-psych-020821-010855. Epub 2021 Jul 19.
4. Facial Expression Recognition in Videos using Dynamic Kernels.
   IEEE Trans Image Process. 2020 Jul 30;PP. doi: 10.1109/TIP.2020.3011846.
5. A database of news videos for investigating the dynamics of emotion and memory.
   Behav Res Methods. 2020 Aug;52(4):1469-1479. doi: 10.3758/s13428-019-01327-w.
6. PiSCES: Pictures with social context and emotional scenes with norms for emotional valence, intensity, and social engagement.
   Behav Res Methods. 2018 Oct;50(5):1793-1805. doi: 10.3758/s13428-017-0947-x.
7. The state of the heart: Emotional labor as emotion regulation reviewed and revised.
   J Occup Health Psychol. 2017 Jul;22(3):407-422. doi: 10.1037/ocp0000067. Epub 2017 Feb 2.
8. Norms of valence, arousal, concreteness, familiarity, imageability, and context availability for 1,100 Chinese words.
   Behav Res Methods. 2017 Aug;49(4):1374-1385. doi: 10.3758/s13428-016-0793-2.
9. Norms of valence and arousal for 14,031 Spanish words.
   Behav Res Methods. 2017 Feb;49(1):111-123. doi: 10.3758/s13428-015-0700-2.
10. Norms of valence, arousal, and dominance for 13,915 English lemmas.
   Behav Res Methods. 2013 Dec;45(4):1191-207. doi: 10.3758/s13428-012-0314-x.