Synthetic Reality Lab, Department of Computer Science, University of Central Florida, Orlando, FL 32816, USA.
Department of Computer Science, University of Maryland, College Park, MD 20742, USA.
Sensors (Basel). 2023 Jul 21;23(14):6572. doi: 10.3390/s23146572.
Recognizing the affective state of children with autism spectrum disorder (ASD) in real-world settings is challenging due to varying head poses, illumination levels, occlusion, and a lack of datasets annotated with emotions in in-the-wild scenarios. Understanding the emotional state of children with ASD is crucial for providing personalized interventions and support. Existing methods often rely on controlled lab environments, limiting their applicability to real-world scenarios; hence, a framework is needed that can recognize the affective states of children with ASD in uncontrolled settings. This paper presents such a framework, based on heart rate (HR) information, for recognizing the affective state of children with ASD in an in-the-wild setting. More specifically, an algorithm is developed that classifies a participant's emotion as positive, negative, or neutral by analyzing the heart rate signal acquired from a smartwatch. The heart rate data are obtained in real time by a smartwatch application while the child learns to program a robot and interacts with an avatar that helps the child develop communication and programming skills. We also present a semi-automated technique, based on facial expression recognition, for annotating the heart rate data. The HR signal is analyzed to extract features that capture the emotional state of the child, and the performance of an emotion classifier operating on the raw HR signal is compared with that of a classifier operating on features extracted from the HR signal using the discrete wavelet transform (DWT). The experimental results demonstrate that the proposed method achieves performance comparable to state-of-the-art HR-based emotion recognition techniques, despite being evaluated in an uncontrolled setting rather than a controlled lab environment.
The framework presented in this paper contributes to the real-world affect analysis of children with ASD using HR information. By enabling emotion recognition in uncontrolled settings, this approach has the potential to improve the monitoring and understanding of the emotional well-being of children with ASD in their daily lives.
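The abstract does not specify how the DWT features are computed from the HR signal. As an illustrative sketch only, the following pure-Python example decomposes a window of HR samples with a Haar wavelet (the wavelet family and the statistics used here are assumptions, not details taken from the paper) and collects simple per-band statistics that could feed an emotion classifier:

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient lists."""
    s = 1 / math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def dwt_features(hr_window, levels=3):
    """Mean, standard deviation, and energy of each DWT sub-band.
    (Illustrative feature set; the paper's actual features may differ.)"""
    features = []
    current = list(hr_window)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        n = len(detail)
        mean = sum(detail) / n
        std = math.sqrt(sum((d - mean) ** 2 for d in detail) / n)
        energy = sum(d * d for d in detail)
        features.extend([mean, std, energy])
    # Also summarize the final approximation band.
    n = len(current)
    mean = sum(current) / n
    std = math.sqrt(sum((c - mean) ** 2 for c in current) / n)
    features.extend([mean, std, sum(c * c for c in current)])
    return features

# Hypothetical 16-sample window of smartwatch HR readings (bpm).
window = [78, 80, 82, 85, 90, 95, 93, 88, 84, 82, 80, 79, 78, 77, 77, 76]
feats = dwt_features(window)
print(len(feats))  # 3 stats x 3 detail bands + 3 for the approximation band = 12
```

In practice a library such as PyWavelets would typically be used for the decomposition; the hand-rolled Haar transform above is only meant to make the sub-band feature idea concrete.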