Liu Ping, Zhang Yi, Xiong Ziyue, Gao Ying
School of Business, Sichuan University, Chengdu, China.
Front Psychol. 2024 May 3;15:1302253. doi: 10.3389/fpsyg.2024.1302253. eCollection 2024.
To study the emotional interaction between customers and service staff, researchers typically use single-modal stimuli to activate subjects' emotions, while more efficient multimodal emotion stimuli are often neglected. This study constructs a multimodal emotion stimulus database (CCSIAS) from video recordings of the real work status of 29 service staff and audio clips of customer-staff interactions, collected over 15 consecutive days by setting up wide-angle cameras and searching the company's Ocean Engine. First, in Study 1 we developed a tool to assess the emotional states of customers and service staff. Second, in Study 2, 40 master's and PhD students were invited to rate the audio and video data with the tool developed in Study 1, evaluating the emotional states of customers and service staff. Third, 118 participants were recruited to test the results of Study 2 and verify the stability of the derived data. The results yielded 139 sets of stable emotional audio and video data (26 sets high, 59 sets medium, and 54 sets low in intensity). The amount of emotional information is important for effectively activating participants' emotional states, and the degree of emotional activation elicited by the video data was significantly higher than that elicited by the audio data. Overall, the findings show that research on emotional interaction phenomena requires a multimodal dataset. The CCSIAS (https://osf.io/muc86/) can extend the depth and breadth of emotional interaction research and can be applied to activating different emotional states of customers and service staff in the fields of organizational behavior and psychology.