EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Montpellier, France.
College of the Holy Cross, Worcester, MA, United States of America.
PLoS One. 2024 Sep 25;19(9):e0309831. doi: 10.1371/journal.pone.0309831. eCollection 2024.
Conversations encompass continuous exchanges of verbal and nonverbal information. Previous research has demonstrated that interlocutors' gestures dynamically entrain each other and that speakers tend to align their vocal properties. While gesture and speech are known to synchronize at the intrapersonal level, few studies have investigated the multimodal dynamics of gesture/speech between individuals. The present study aims to extend our comprehension of the unimodal dynamics of speech and gesture to multimodal speech/gesture dynamics. We used an online dataset of 14 dyads engaged in unstructured conversation. Speech and gesture synchronization was measured with cross-wavelet analysis at different timescales. Results supported previous research on intrapersonal speech/gesture coordination, finding synchronization at all timescales of the conversation. Extending the literature, we also found interpersonal synchronization between speech and gesture. Given that unimodal and multimodal synchronization occurred at similar timescales, we suggest that synchronization likely depends on the vocal channel, particularly on the turn-taking dynamics of the conversation.
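The abstract's core measurement is cross-wavelet analysis, which quantifies how strongly two time series (here, speech and gesture amplitude envelopes) co-vary at each timescale. As a minimal sketch of the idea, not the authors' actual pipeline, the following computes a Morlet-wavelet cross-spectrum between two synthetic signals sharing a common oscillation; the signal parameters and the simple convolution-based transform are illustrative assumptions.

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform of a 1-D signal using a complex
    Morlet wavelet, computed by direct convolution at each scale."""
    n = len(x)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        m = int(min(10 * s, n))                 # wavelet support, ~10 std devs
        t = np.arange(-m // 2, m // 2)
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        psi /= np.sqrt(s)                       # scale normalization
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")
    return out

# Two noisy signals sharing a ~20-sample oscillation (hypothetical stand-ins
# for a speech envelope and a gesture velocity trace).
rng = np.random.default_rng(0)
t = np.arange(512)
f = 0.05                                        # shared frequency, period = 20
x = np.sin(2 * np.pi * f * t) + 0.3 * rng.standard_normal(512)
y = np.sin(2 * np.pi * f * t + 0.8) + 0.3 * rng.standard_normal(512)

scales = np.arange(2, 64, dtype=float)
Wx, Wy = morlet_cwt(x, scales), morlet_cwt(y, scales)
xwt = Wx * np.conj(Wy)                          # cross-wavelet spectrum
# Mean cross-wavelet power per scale over a central window (avoids edge effects).
power = np.abs(xwt)[:, 128:384].mean(axis=1)
peak_scale = scales[np.argmax(power)]
```

For a Morlet wavelet with `w0 = 6`, scale relates to frequency roughly as `f ≈ w0 / (2π·s)`, so the shared 0.05 Hz component should produce a power peak near scale 19; the phase of `xwt` additionally encodes the lead/lag relation between the two channels, which is how timescale-specific synchronization is read off in studies like this one.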