Facial Emotions Are Accurately Encoded in the Neural Signal of Those With Autism Spectrum Disorder: A Deep Learning Approach.

Author Information

Mayor Torres Juan Manuel, Clarkson Tessa, Hauschild Kathryn M, Luhmann Christian C, Lerner Matthew D, Riccardi Giuseppe

Affiliations

Department of Information Engineering and Computer Science, University of Trento, Povo Trento, Italy.

Department of Psychology, Temple University, Philadelphia, Pennsylvania.

Publication Information

Biol Psychiatry Cogn Neurosci Neuroimaging. 2022 Jul;7(7):688-695. doi: 10.1016/j.bpsc.2021.03.015. Epub 2021 Apr 16.

Abstract

BACKGROUND

Individuals with autism spectrum disorder (ASD) exhibit frequent behavioral deficits in facial emotion recognition (FER). It remains unknown whether these deficits arise because facial emotion information is not encoded in their neural signal or because it is encoded but fails to translate to FER behavior (deployment). This distinction has functional implications, including constraining when differences in social information processing occur in ASD, and guiding interventions (i.e., developing prosthetic FER vs. reinforcing existing skills).

METHODS

We utilized a discriminative and contemporary machine learning approach, deep convolutional neural networks, to classify facial emotions viewed by individuals with and without ASD (N = 88) from concurrently recorded electroencephalography signals.
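
A minimal sketch of this kind of classifier follows, offered purely as an illustration: the abstract does not describe the authors' architecture, so the layer choices, the 64-channel montage, the 750-sample epoch length (1.5 s at an assumed 500 Hz sampling rate), the four emotion classes, and the class name EEGEmotionCNN are all assumptions.

# Hypothetical sketch (not the published architecture): a small convolutional
# network that maps one EEG epoch, shaped (channels, time_samples), to one of
# several facial-emotion classes.
import torch
import torch.nn as nn

class EEGEmotionCNN(nn.Module):
    def __init__(self, n_channels: int = 64, n_samples: int = 750, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learns band-pass-like filters along time.
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            # Spatial convolution: mixes information across EEG channels.
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(0.5),
        )
        # Probe the feature-map size so the linear layer matches any input shape.
        with torch.no_grad():
            n_feat = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_feat, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples) -> add a singleton "image" channel.
        x = self.features(x.unsqueeze(1))
        return self.classifier(x.flatten(start_dim=1))

# Usage: a batch of 8 epochs, 64 channels, 750 samples.
logits = EEGEmotionCNN()(torch.randn(8, 64, 750))
print(logits.shape)  # torch.Size([8, 4])

The temporal-then-spatial convolution ordering is a common pattern in EEG decoding; the published model and training pipeline may differ in depth, filter sizes, and preprocessing.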

RESULTS

The convolutional neural network classified facial emotions with high accuracy for both ASD and non-ASD groups, even though individuals with ASD performed more poorly on the concurrent FER task. In fact, convolutional neural network accuracy was greater in the ASD group and was not related to behavioral performance. This pattern of results replicated across three independent participant samples. Moreover, feature importance analyses suggested that a late temporal window of neural activity (1000-1500 ms) may be uniquely important in facial emotion classification for individuals with ASD.
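
The abstract does not specify how feature importance over time was computed. The sketch below shows one common occlusion-style analysis, under the same assumptions as the model sketch above: successive time windows of each epoch are zeroed out, and the resulting drop in classification accuracy indicates how much each window contributes. The temporal_importance helper, the 250 ms window, and the 500 Hz sampling rate are all hypothetical.

# Hypothetical occlusion-style importance analysis over time windows.
import torch

def temporal_importance(model, x, y, sfreq=500, win_ms=250):
    """x: (n_epochs, n_channels, n_samples); y: (n_epochs,) integer labels."""
    model.eval()
    win = int(sfreq * win_ms / 1000)
    with torch.no_grad():
        baseline = (model(x).argmax(dim=1) == y).float().mean().item()
        drops = []
        for start in range(0, x.shape[-1] - win + 1, win):
            occluded = x.clone()
            occluded[..., start:start + win] = 0.0  # mask this time window
            acc = (model(occluded).argmax(dim=1) == y).float().mean().item()
            drops.append((start / sfreq * 1000, baseline - acc))  # (onset in ms, accuracy drop)
    return drops  # a larger drop suggests the window carries more class information

# Usage with the sketch model above (untrained, so the numbers are meaningless):
# importance = temporal_importance(EEGEmotionCNN(), torch.randn(32, 64, 750),
#                                  torch.randint(0, 4, (32,)))

Under this kind of analysis, a result like the one reported (a late 1000-1500 ms window mattering for the ASD group) would appear as a larger accuracy drop when that window is occluded.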

CONCLUSIONS

Our results reveal for the first time that facial emotion information is encoded in the neural signal of individuals with (and without) ASD. Thus, observed difficulties in behavioral FER associated with ASD likely arise from difficulties in decoding or deployment of facial emotion information within the neural signal. Interventions should focus on capitalizing on this intact encoding rather than promoting compensation or FER prostheses.
