Rodger Helen, Vizioli Luca, Ouyang Xinyi, Caldara Roberto
Department of Psychology, University of Fribourg, Switzerland.
Dev Sci. 2015 Nov;18(6):926-39. doi: 10.1111/desc.12281. Epub 2015 Feb 20.
Reading the non-verbal cues from faces to infer the emotional states of others is central to our daily social interactions from very early in life. Despite the relatively well-documented ontogeny of facial expression recognition in infancy, our understanding of the development of this critical social skill throughout childhood into adulthood remains limited. To this end, using a psychophysical approach, we implemented the QUEST threshold-seeking algorithm to parametrically manipulate the quantity of signal available in faces, normalized for contrast and luminance, displaying the six emotional expressions plus neutral. We thus determined observers' perceptual thresholds for effective discrimination of each emotional expression from 5 years of age up to adulthood. Consistent with previous studies, happiness was most easily recognized, requiring minimal signal (35% on average), whereas fear required the most signal (97% on average) across groups. Overall, recognition improved with age for all expressions except happiness and fear, for which all age groups, including the youngest, remained within the adult range. Uniquely, our findings characterize the recognition trajectories of the six basic emotions as falling into three distinct groupings: expressions that show a steep improvement with age - disgust, neutral, and anger; expressions that show a more gradual improvement with age - sadness and surprise; and those that remain stable from early childhood - happiness and fear, indicating that the coding for these expressions is already mature by 5 years of age. Altogether, our data provide for the first time a fine-grained mapping of the development of facial expression recognition. This approach significantly increases our understanding of the decoding of emotions across development and offers a novel tool to measure impairments for specific facial expressions in developmental clinical populations.
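To illustrate the threshold-seeking logic described above, the following is a minimal Python sketch of a QUEST-style Bayesian staircase. It is not the authors' implementation: the function names (weibull, run_quest), the psychometric-function parameters (beta, lapse), the stimulus grid, the prior, the number of trials, and the simulated observer are all illustrative assumptions; the chance level of 1/7 simply reflects a seven-alternative choice (six expressions plus neutral). Only the core idea, placing each trial at the currently most probable threshold and updating a posterior from the response, follows the QUEST procedure.

import numpy as np

def weibull(intensity, threshold, beta=3.5, gamma=1/7, lapse=0.01):
    # Probability of a correct response at a given signal intensity (0-1).
    # gamma = 1/7 chance level for a seven-alternative choice (assumption).
    return gamma + (1 - gamma - lapse) * (1 - np.exp(-10 ** (beta * (intensity - threshold))))

def run_quest(observer, n_trials=40, start_guess=0.5, prior_sd=0.25,
              grid=np.linspace(0.01, 1.0, 200)):
    # Bayesian adaptive staircase: each trial is placed at the currently most
    # probable threshold, and the posterior is updated from the response.
    log_posterior = -0.5 * ((grid - start_guess) / prior_sd) ** 2  # Gaussian prior (log scale)
    for _ in range(n_trials):
        test_level = grid[np.argmax(log_posterior)]   # next stimulus intensity
        correct = observer(test_level)                # run one trial
        p = weibull(test_level, grid)                 # likelihood for every candidate threshold
        log_posterior += np.log(p if correct else 1 - p)
    return grid[np.argmax(log_posterior)]             # posterior-mode threshold estimate

# Simulated observer with a true threshold of 0.35 (cf. happiness, 35% on average).
rng = np.random.default_rng(0)
simulated = lambda level: rng.random() < weibull(level, threshold=0.35)
print(f"Estimated threshold: {run_quest(simulated):.2f}")

Because every response sharpens the posterior over candidate thresholds, the tested signal level converges quickly toward the intensity yielding the target accuracy; running one such staircase per expression and age group would produce threshold estimates comparable in kind to the 35% (happiness) and 97% (fear) averages reported in the abstract.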