Drimalla Hanna, Scheffer Tobias, Landwehr Niels, Baskow Irina, Roepke Stefan, Behnia Behnoush, Dziobek Isabel
1. Department of Psychology, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099 Berlin, Germany.
2. Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099 Berlin, Germany.
NPJ Digit Med. 2020 Feb 28;3:25. doi: 10.1038/s41746-020-0227-5. eCollection 2020.
Social interaction deficits are evident in many psychiatric conditions, and specifically in autism spectrum disorder (ASD), but are hard to assess objectively. We present a digital tool that automatically quantifies biomarkers of social interaction deficits: the simulated interaction task (SIT), which entails a standardized 7-min simulated dialog via video and the automated analysis of facial expressions, gaze behavior, and voice characteristics. In a study with 37 adults with ASD without intellectual disability and 43 healthy controls, we show the potential of the tool as a diagnostic instrument and for better description of ASD-associated social phenotypes. Using machine-learning tools, we detected individuals with ASD with an accuracy of 73%, a sensitivity of 67%, and a specificity of 79%, based on their facial expressions and vocal characteristics alone. Reduced social smiling and facial mimicry, as well as a higher voice fundamental frequency and harmonics-to-noise ratio, were particularly characteristic of individuals with ASD. The time- and cost-effective computer-based analysis outperformed a majority vote and performed on a par with clinical expert ratings.
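The reported performance figures (accuracy, sensitivity, specificity) can all be derived from the classifier's confusion matrix. A minimal sketch, with ASD coded as the positive class and purely hypothetical labels (the study's actual predictions are not given here):

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, sensitivity (true-positive rate), and
    specificity (true-negative rate) from binary labels,
    where 1 = positive class (here: ASD) and 0 = negative (control)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return accuracy, sensitivity, specificity

# Hypothetical example: 3 ASD participants, 2 controls
y_true = [1, 1, 1, 0, 0]
y_pred = [1, 0, 1, 0, 1]
acc, sens, spec = classification_metrics(y_true, y_pred)
# acc = 0.6, sens = 2/3, spec = 0.5
```

Sensitivity here is the fraction of ASD participants correctly detected (67% in the study), while specificity is the fraction of controls correctly ruled out (79%); accuracy averages over both groups.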