Decoding facial expressions based on face-selective and motion-sensitive areas.

Authors

Liang Yin, Liu Baolin, Xu Junhai, Zhang Gaoyan, Li Xianglin, Wang Peiyuan, Wang Bin

Affiliations

School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, 300350, People's Republic of China.

State Key Laboratory of Intelligent Technology and Systems, National Laboratory for Information Science and Technology, Tsinghua University, Beijing, 100084, People's Republic of China.

Publication Information

Hum Brain Mapp. 2017 Jun;38(6):3113-3125. doi: 10.1002/hbm.23578. Epub 2017 Mar 27.

Abstract

Humans can easily recognize others' facial expressions. Among the brain substrates that enable this ability, considerable attention has been paid to face-selective areas; in contrast, whether motion-sensitive areas, which clearly exhibit sensitivity to facial movements, are involved in facial expression recognition remains unclear. The present functional magnetic resonance imaging (fMRI) study used multi-voxel pattern analysis (MVPA) to explore facial expression decoding in both face-selective and motion-sensitive areas. In a block-design experiment, participants viewed facial expressions of six basic emotions (anger, disgust, fear, joy, sadness, and surprise) in images, videos, and eyes-obscured videos. Due to the use of multiple stimulus types, the impacts of facial motion and eye-related information on facial expression decoding were also examined. It was found that motion-sensitive areas showed significant responses to emotional expressions and that dynamic expressions could be successfully decoded in both face-selective and motion-sensitive areas. Compared with static stimuli, dynamic expressions elicited consistently higher neural responses and decoding performance in all regions. A significant decrease in both activation and decoding accuracy due to the absence of eye-related information was also observed. Overall, the findings showed that emotional expressions are represented in motion-sensitive areas in addition to conventional face-selective areas, suggesting that motion-sensitive regions may also effectively contribute to facial expression recognition. The results also suggested that facial motion and eye-related information played important roles by carrying considerable expression information that could facilitate facial expression recognition. Hum Brain Mapp 38:3113-3125, 2017. © 2017 Wiley Periodicals, Inc.
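
The abstract describes the analysis approach only at a high level: voxel activation patterns from face-selective and motion-sensitive regions of interest (ROIs) are fed to a classifier trained to distinguish the six expressions (MVPA). As a rough sketch of what such an ROI decoding pipeline typically looks like, the Python example below trains a linear SVM with stratified cross-validation on synthetic per-block patterns. The ROI size, block counts, classifier settings, and data are illustrative placeholders and are not details reported in this record.

# Minimal MVPA decoding sketch (illustrative only); the paper's actual
# preprocessing, ROI definitions, and classifier are not specified here.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

emotions = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]
n_blocks_per_emotion = 12   # hypothetical number of blocks per condition
n_voxels = 300              # hypothetical ROI size (e.g., an STS or MT+ mask)

# X: one activation pattern per block (rows) over ROI voxels (columns);
# y: the emotion label of each block. Real data would come from GLM betas
# or averaged block responses, not random numbers.
X = rng.normal(size=(n_blocks_per_emotion * len(emotions), n_voxels))
y = np.repeat(np.arange(len(emotions)), n_blocks_per_emotion)

# Linear SVM with per-voxel standardization, evaluated by stratified
# cross-validation so every fold contains all six emotions.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
cv = StratifiedKFold(n_splits=6, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)

print(f"mean decoding accuracy: {scores.mean():.3f} (chance ~ {1 / len(emotions):.3f})")

In this kind of analysis, cross-validated accuracy in an ROI is compared against the six-class chance level (about 16.7%); accuracy reliably above chance is taken as evidence that the region's activity patterns carry expression information.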

Similar Articles

1. Decoding facial expressions based on face-selective and motion-sensitive areas. Hum Brain Mapp. 2017 Jun;38(6):3113-3125. doi: 10.1002/hbm.23578. Epub 2017 Mar 27.
2. Dynamic and static facial expressions decoded from motion-sensitive areas in the macaque monkey. J Neurosci. 2012 Nov 7;32(45):15952-62. doi: 10.1523/JNEUROSCI.1992-12.2012.
3. Neural decoding of visual stimuli varies with fluctuations in global network efficiency. Hum Brain Mapp. 2017 Jun;38(6):3069-3080. doi: 10.1002/hbm.23574. Epub 2017 Mar 25.
4. Functional integration of the posterior superior temporal sulcus correlates with facial expression recognition. Hum Brain Mapp. 2016 May;37(5):1930-40. doi: 10.1002/hbm.23145. Epub 2016 Feb 25.
5. Multivariate Pattern Classification of Facial Expressions Based on Large-Scale Functional Connectivity. Front Hum Neurosci. 2018 Mar 19;12:94. doi: 10.3389/fnhum.2018.00094. eCollection 2018.
6. Neural responses to facial expressions support the role of the amygdala in processing threat. Soc Cogn Affect Neurosci. 2014 Nov;9(11):1684-9. doi: 10.1093/scan/nst162. Epub 2013 Oct 4.
7. Dynamic stimuli demonstrate a categorical representation of facial expression in the amygdala. Neuropsychologia. 2014 Apr;56(100):47-52. doi: 10.1016/j.neuropsychologia.2014.01.005. Epub 2014 Jan 18.
8. Cross-Subject Commonality of Emotion Representations in Dorsal Motion-Sensitive Areas. Front Neurosci. 2020 Oct 14;14:567797. doi: 10.3389/fnins.2020.567797. eCollection 2020.

Cited By

1. Cross-modal decoding of emotional expressions in fMRI-Cross-session and cross-sample replication. Imaging Neurosci (Camb). 2024 Sep 23;2. doi: 10.1162/imag_a_00289. eCollection 2024.
2. More than labels: neural representations of emotion words are widely distributed across the brain. Soc Cogn Affect Neurosci. 2024 Jul 19;19(1). doi: 10.1093/scan/nsae043.
3. Recognizing facial expressions of emotion amid noise: A dynamic advantage. J Vis. 2024 Jan 2;24(1):7. doi: 10.1167/jov.24.1.7.
4. Decoding six basic emotions from brain functional connectivity patterns. Sci China Life Sci. 2023 Apr;66(4):835-847. doi: 10.1007/s11427-022-2206-3. Epub 2022 Nov 11.
5. Face mediated human-robot interaction for remote medical examination. Sci Rep. 2022 Jul 22;12(1):12592. doi: 10.1038/s41598-022-16643-z.
6. Cross-Subject Commonality of Emotion Representations in Dorsal Motion-Sensitive Areas. Front Neurosci. 2020 Oct 14;14:567797. doi: 10.3389/fnins.2020.567797. eCollection 2020.

References

1. Investigating the brain basis of facial expression perception using multi-voxel pattern analysis. Cortex. 2015 Aug;69:131-40. doi: 10.1016/j.cortex.2015.05.003. Epub 2015 May 14.
2. Face, eye, and body selective responses in fusiform gyrus and adjacent cortex: an intracranial EEG study. Front Hum Neurosci. 2014 Aug 21;8:642. doi: 10.3389/fnhum.2014.00642. eCollection 2014.
3. Network Interactions Explain Sensitivity to Dynamic Faces in the Superior Temporal Sulcus. Cereb Cortex. 2015 Sep;25(9):2876-82. doi: 10.1093/cercor/bhu083. Epub 2014 Apr 25.
4. Early sensitivity for eyes within faces: a new neuronal account of holistic and featural processing. Neuroimage. 2014 Aug 15;97:81-94. doi: 10.1016/j.neuroimage.2014.04.042. Epub 2014 Apr 21.
5. Emotional expressions evoke a differential response in the fusiform face area. Front Hum Neurosci. 2013 Oct 28;7:692. doi: 10.3389/fnhum.2013.00692. eCollection 2013.
6. Top-down control of visual responses to fear by the amygdala. J Neurosci. 2013 Oct 30;33(44):17435-43. doi: 10.1523/JNEUROSCI.2992-13.2013.
7. Brain networks subserving the evaluation of static and dynamic facial expressions. Cortex. 2013 Oct;49(9):2462-72. doi: 10.1016/j.cortex.2013.01.002. Epub 2013 Jan 17.
8. Perception of face and body expressions using electromyography, pupillometry and gaze measures. Front Psychol. 2013 Feb 8;4:28. doi: 10.3389/fpsyg.2013.00028. eCollection 2013.
9. Multivoxel pattern analysis for FMRI data: a review. Comput Math Methods Med. 2012;2012:961257. doi: 10.1155/2012/961257. Epub 2012 Dec 6.
10. Dynamic and static facial expressions decoded from motion-sensitive areas in the macaque monkey. J Neurosci. 2012 Nov 7;32(45):15952-62. doi: 10.1523/JNEUROSCI.1992-12.2012.
