Bálint András, Wimmer Wilhelm, Caversaccio Marco, Rummel Christian, Weder Stefan
Hearing Research Laboratory, ARTORG Center for Biomedical Engineering Research, University of Bern, 3008 Bern, Switzerland; Department of ENT - Head and Neck Surgery, Inselspital, Bern University Hospital, University of Bern, 3010 Bern, Switzerland.
Department of ENT - Head and Neck Surgery, Inselspital, Bern University Hospital, University of Bern, 3010 Bern, Switzerland; Department of Otorhinolaryngology, Klinikum rechts der Isar, Technical University of Munich, Munich, Germany.
Hear Res. 2025 Jan;455:109155. doi: 10.1016/j.heares.2024.109155. Epub 2024 Nov 30.
Understanding how the brain processes auditory and visual speech is essential for advancing speech perception research and improving clinical interventions for individuals with hearing impairment. Functional near-infrared spectroscopy (fNIRS) is considered highly suitable for measuring brain activity during language tasks. However, accurate data interpretation also requires validated stimuli and behavioral measures.
Twenty-six adults with normal hearing listened to sentences from the Oldenburg Sentence Test (OLSA), and brain activation in the temporal, occipital, and prefrontal areas was measured with fNIRS. The sentences were presented in one of four modalities: speech-in-quiet, speech-in-noise, audiovisual speech, or visual speech (i.e., lipreading). To support the interpretation of our fNIRS data, and to obtain a more comprehensive understanding of the study population, we performed hearing tests (pure tone and speech audiometry) and collected behavioral data using validated questionnaires, in-task comprehension questions, and listening effort ratings.
In the auditory conditions (i.e., speech-in-quiet and speech-in-noise), we observed cortical activity in the temporal regions bilaterally. In the visual speech condition, we measured significant activation in the occipital area. In the audiovisual condition, cortical activation was observed in both regions. Furthermore, we established a baseline for how individuals with normal hearing process visual cues during lipreading, and we found higher activity in the prefrontal cortex in noise conditions than in quiet conditions, consistent with increased listening effort.
We demonstrated the applicability of a clinically inspired audiovisual speech-comprehension task in participants with normal hearing. The measured brain activation patterns were supported and complemented by objective and behavioral parameters.