Ohshima Saori, Koeda Michihiko, Kawai Wakana, Saito Hikaru, Niioka Kiyomitsu, Okuno Koki, Naganawa Sho, Hama Tomoko, Kyutoku Yasushi, Dan Ippeita
Applied Cognitive Neuroscience Laboratory, Faculty of Science and Engineering, Chuo University, Bunkyo, Japan.
Department of Neuropsychiatry, Graduate School of Medicine, Nippon Medical School, Bunkyo, Japan.
Front Hum Neurosci. 2023 Dec 29;17:1160392. doi: 10.3389/fnhum.2023.1160392. eCollection 2023.
Humans mainly rely on visual and auditory information as cues to infer others' emotions. Previous neuroimaging studies have elucidated the neural basis of memory processing based on facial expressions, but few have examined it based on vocal cues. We therefore aimed to identify brain regions associated with emotional judgment based on vocal cues using an N-back task paradigm.
Thirty participants performed N-back tasks requiring them to judge either the emotion or the gender of voices that carried both emotion and gender information. During these tasks, cerebral hemodynamic responses were measured using functional near-infrared spectroscopy (fNIRS).
The results revealed that the Emotion 2-back task elicited significant activation in the frontal area, including the right precentral and inferior frontal gyri, possibly reflecting an attentional network engaged in auditory top-down processing. Significant activation was also observed in the ventrolateral prefrontal cortex, a region known to play a central role in working memory.
These results suggest that, compared with judging the gender of voice stimuli, judging emotional information engages attention more deeply and places greater demands on higher-order cognition, including working memory. To our knowledge, this is the first demonstration of a specific neural basis for emotional judgments based on vocal cues, as contrasted with gender judgments based on the same cues.