McCullough Stephen, Emmorey Karen, Sereno Martin
Laboratory for Cognitive Neuroscience, The Salk Institute for Biological Studies, 10010 North Torrey Pines Rd., La Jolla, CA 92037, USA.
Brain Res Cogn Brain Res. 2005 Feb;22(2):193-203. doi: 10.1016/j.cogbrainres.2004.08.012.
Recognition of emotional facial expressions is universal for all humans, but signed language users must also recognize certain non-affective facial expressions as linguistic markers. fMRI was used to investigate the neural systems underlying recognition of these functionally distinct expressions, comparing deaf ASL signers and hearing nonsigners. Within the superior temporal sulcus (STS), activation for emotional expressions was right lateralized for the hearing group and bilateral for the deaf group. In contrast, activation within STS for linguistic facial expressions was left lateralized only for signers and only when linguistic facial expressions co-occurred with verbs. Within the fusiform gyrus (FG), activation was left lateralized for ASL signers for both expression types, whereas activation was bilateral for both expression types for nonsigners. We propose that left lateralization in FG may be due to continuous analysis of local facial features during on-line sign language processing. The results indicate that function in part drives the lateralization of neural systems that process human facial expressions.