Axelrod Vadim, Rozier Camille, Malkinson Tal Seidel, Lehongre Katia, Adam Claude, Lambrecq Virginie, Navarro Vincent, Naccache Lionel
The Gonda Multidisciplinary Brain Research Center, Bar Ilan University, Ramat Gan, 52900, Israel.
Institut National de la Santé et de la Recherche Médicale Unité 1127, Centre National de la Recherche Scientifique Unité Mixte de Recherche (UMR) 7225, Université Pierre-et-Marie-Curie Univ Paris 06 UMR S 1127, Institut du Cerveau et de la Moelle Épinière ICM, 75013, Paris, France.
Neuropsychologia. 2022 Jun 6;170:108228. doi: 10.1016/j.neuropsychologia.2022.108228. Epub 2022 Mar 28.
When we see someone's face, our brain usually extracts a variety of information effortlessly, such as facial identity, expression, or gaze direction. While it is widely accepted that dedicated subsystems are responsible for different aspects of face processing, how these subsystems work together is not yet fully understood. To this end, one of the most explored questions is whether, and if so to what extent, facial expression processing interacts with other stages of facial processing. In the present study, we report a rare case of a patient in whom we were able to record multi-unit activity (MUA) in the proximity of the fusiform face area (FFA); two of the four recorded multi-units were face-selective. In our experiment, the patient was shown images of neutral and fearful faces as well as everyday objects and frightening images of natural disasters. We found that the activity of both face-selective units was modulated by facial expression, starting at about 150 ms from stimulus onset. For both facial conditions we observed an abrupt increase in firing rate with a simultaneous peak, suggesting that this activity, and its modulation by facial expression, likely reflected feed-forward processing. Interestingly, while in one multi-unit the firing rate for fearful faces was higher than for neutral faces, in the other multi-unit the polarity was reversed. Finally, modulation in the face-selective units was specific to emotional facial stimuli, not to emotional stimuli in general. The present results, albeit obtained from only a few multi-units, are nevertheless potentially valuable for understanding the mechanisms of facial processing in humans.