Preisig Basil C, Eggenberger Noëmi, Zito Giuseppe, Vanbellingen Tim, Schumacher Rahel, Hopfner Simone, Nyffeler Thomas, Gutbrod Klemens, Annoni Jean-Marie, Bohlhalter Stephan, Müri René M
Perception and Eye Movement Laboratory, Departments of Neurology and Clinical Research, Inselspital, University Hospital Bern, and University of Bern, Switzerland.
ARTORG Center for Biomedical Engineering Research, University of Bern, Switzerland.
Cortex. 2015 Mar;64:157-68. doi: 10.1016/j.cortex.2014.10.013. Epub 2014 Nov 4.
Co-speech gestures are part of nonverbal communication during conversations. They either support the verbal message or provide the interlocutor with additional information. Furthermore, as nonverbal cues they prompt the cooperative process of turn-taking. In the present study, we investigated the influence of co-speech gestures on the perception of dyadic dialogue in aphasic patients. In particular, we analysed the impact of co-speech gestures on gaze direction (towards the speaker or the listener) and on the fixation of body parts. We hypothesized that aphasic patients, whose verbal comprehension is restricted, adapt their visual exploration strategies.
Sixteen aphasic patients and 23 healthy control subjects participated in the study. Visual exploration behaviour was measured with a contact-free infrared eye-tracker while subjects watched videos depicting spontaneous dialogues between two individuals. Cumulative fixation duration and mean fixation duration were calculated for the factors co-speech gesture (present vs absent), gaze direction (towards the speaker or the listener), and region of interest (ROI: hands, face, and body).
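As an illustration of how the two dependent measures could be derived from raw eye-tracking output, the following Python sketch computes cumulative and mean fixation duration per subject, condition, and ROI from a fixation table. The column names (`subject`, `gesture`, `gaze_direction`, `roi`, `duration_ms`) and the toy data are hypothetical and not taken from the paper.

```python
import pandas as pd

# Hypothetical fixation table: one row per fixation event, with the
# experimental factors attached. Column names and values are illustrative.
fixations = pd.DataFrame({
    "subject":        ["s01", "s01", "s01", "s02", "s02"],
    "group":          ["aphasic", "aphasic", "aphasic", "control", "control"],
    "gesture":        ["present", "present", "absent", "present", "absent"],
    "gaze_direction": ["speaker", "speaker", "listener", "speaker", "speaker"],
    "roi":            ["face", "hands", "face", "face", "body"],
    "duration_ms":    [310, 180, 250, 420, 150],
})

# Aggregate per subject x condition cell:
#   cumulative fixation duration = sum of fixation durations in the cell
#   mean fixation duration       = average fixation duration in the cell
metrics = (
    fixations
    .groupby(["subject", "group", "gesture", "gaze_direction", "roi"])
    ["duration_ms"]
    .agg(cumulative_ms="sum", mean_ms="mean")
    .reset_index()
)

print(metrics)
```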
Both aphasic patients and healthy controls mainly fixated the speaker's face. We found a significant co-speech gesture × ROI interaction, indicating that the presence of a co-speech gesture encouraged subjects to look at the speaker. Furthermore, a significant gaze direction × ROI × group interaction revealed that aphasic patients showed reduced cumulative fixation duration on the speaker's face compared with healthy controls.
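For intuition only, a simplified version of such an interaction test can be sketched with the pingouin library. This assumes a reduced two-factor design (ROI within subjects, group between subjects) on cumulative fixation durations, since `pingouin.mixed_anova` handles a single within-subject factor; it is not the authors' analysis pipeline, and the data are synthetic.

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Synthetic balanced data: each subject contributes one cumulative fixation
# duration per ROI; group membership is a between-subject factor.
rng = np.random.default_rng(0)
rows = []
for i in range(12):
    subject = f"s{i:02d}"
    group = "aphasic" if i < 6 else "control"
    for roi in ["face", "hands", "body"]:
        base = 400 if roi == "face" else 150   # faces attract most fixation
        rows.append({"subject": subject, "group": group, "roi": roi,
                     "cumulative_ms": base + rng.normal(0, 30)})
df = pd.DataFrame(rows)

# Mixed-design ANOVA: ROI within subjects, group between subjects.
aov = pg.mixed_anova(data=df, dv="cumulative_ms", within="roi",
                     subject="subject", between="group")
print(aov.round(3))
```

The group × ROI interaction term in this reduced model corresponds conceptually to the question of whether patients and controls distribute their fixations across face, hands, and body differently.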
Co-speech gestures guide the observer's attention towards the speaker, the source of semantic input. We discuss whether an underlying semantic processing deficit or a deficit in integrating audio-visual information may cause aphasic patients to explore the speaker's face less.