Guha Tanaya, Yang Zhaojun, Ramakrishna Anil, Grossman Ruth B, Hedley Darren, Lee Sungbok, Narayanan Shrikanth S
Signal Analysis and Interpretation Lab, University of Southern California, Los Angeles, CA.
Emerson College, Boston, MA; University of Massachusetts Medical School, Boston, MA.
Proc IEEE Int Conf Acoust Speech Signal Process. 2015 Apr;2015:803-807. doi: 10.1109/ICASSP.2015.7178080.
Children with Autism Spectrum Disorder (ASD) are known to have difficulty in producing and perceiving emotional facial expressions. Their expressions are often perceived as atypical by adult observers. This paper focuses on data-driven ways to analyze and quantify atypicality in the facial expressions of children with ASD. Our objective is to uncover the characteristics of facial gestures that induce the sense of perceived atypicality in observers. Using a carefully collected motion capture database, facial expressions of children with and without ASD are compared within six basic emotion categories, employing methods from information theory, time-series modeling, and statistical analysis. Our experiments show that children with ASD typically have less complex expression-producing mechanisms, and that the differences in facial dynamics between children with and without ASD come primarily from the eye region. Our study also finds that children with ASD exhibit lower symmetry between the left and right facial regions and lower variation in motion intensity across facial regions.
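The abstract names information-theoretic, time-series, and statistical measures without giving formulas, so the sketch below is only an illustrative guess at how such quantities might be computed from motion-capture data, not the paper's actual method. It assumes per-marker trajectories stored as NumPy arrays of shape (T, 3) (T frames of xyz coordinates); the marker names and the placeholder data are hypothetical.

```python
# Illustrative sketch only -- not the paper's method. Assumes facial
# motion-capture trajectories as NumPy arrays of shape (T, 3) per marker.

import numpy as np


def motion_intensity(trajectory):
    """Mean frame-to-frame displacement of a single marker (T x 3 array)."""
    return np.linalg.norm(np.diff(trajectory, axis=0), axis=1).mean()


def lr_symmetry(left, right):
    """Correlation of the speed time series of a mirrored left/right marker pair.

    Values near 1 suggest the two sides move with similar dynamics;
    lower values suggest left/right asymmetry.
    """
    left_speed = np.linalg.norm(np.diff(left, axis=0), axis=1)
    right_speed = np.linalg.norm(np.diff(right, axis=0), axis=1)
    return float(np.corrcoef(left_speed, right_speed)[0, 1])


def spectral_entropy(signal):
    """Shannon entropy of the normalized power spectrum of a 1-D signal --
    one simple proxy for the 'complexity' of a motion signal."""
    power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    p = power / power.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T = 300  # hypothetical 300-frame expression segment
    left_brow = np.cumsum(rng.normal(size=(T, 3)), axis=0)   # placeholder data
    right_brow = np.cumsum(rng.normal(size=(T, 3)), axis=0)  # placeholder data

    print("motion intensity (left brow):", motion_intensity(left_brow))
    print("L/R symmetry (brow pair):", lr_symmetry(left_brow, right_brow))
    left_speed = np.linalg.norm(np.diff(left_brow, axis=0), axis=1)
    print("spectral entropy (left brow speed):", spectral_entropy(left_speed))
```

With real data, such per-marker scores would be aggregated over facial regions (e.g., eye vs. mouth) and compared across the ASD and non-ASD groups with standard statistical tests; the specific complexity and symmetry measures used in the paper may differ from these proxies.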