
Are You on My Wavelength? Interpersonal Coordination in Dyadic Conversations.

Author Information

Hale Joanna, Ward Jamie A, Buccheri Francesco, Oliver Dominic, Hamilton Antonia F de C

Affiliations

1Institute of Cognitive Neuroscience, UCL, Alexandra House, 17 Queen Square, London, WC1N 3AZ UK.

2Computing Department, Goldsmiths, University of London, London, UK.

Publication Information

J Nonverbal Behav. 2020;44(1):63-83. doi: 10.1007/s10919-019-00320-3. Epub 2019 Oct 15.

Abstract

Conversation between two people involves subtle nonverbal coordination in addition to speech. However, the precise parameters and timing of this coordination remain unclear, which limits our ability to theorize about the neural and cognitive mechanisms of social coordination. In particular, it is unclear if conversation is dominated by synchronization (with no time lag), rapid and reactive mimicry (with lags under 1 s) or traditionally observed mimicry (with several seconds lag), each of which demands a different neural mechanism. Here we describe data from high-resolution motion capture of the head movements of pairs of participants (n = 31 dyads) engaged in structured conversations. In a pre-registered analysis pathway, we calculated the wavelet coherence of head motion within dyads as a measure of their nonverbal coordination and report two novel results. First, low-frequency coherence (0.2-1.1 Hz) is consistent with traditional observations of mimicry, and modeling shows this behavior is generated by a mechanism with a constant 600 ms lag between leader and follower. This is in line with rapid reactive (rather than predictive or memory-driven) models of mimicry behavior, and could be implemented in mirror neuron systems. Second, we find an unexpected pattern of lower-than-chance coherence between participants, or hypo-coherence, at high frequencies (2.6-6.5 Hz). Exploratory analyses show that this systematic decoupling is driven by fast nodding from the listening member of the dyad, and may be a newly identified social signal. These results provide a step towards the quantification of real-world human behavior in high resolution and provide new insights into the mechanisms of social coordination.
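The abstract's first result hinges on recovering a constant leader-follower delay from paired motion time series. A minimal sketch of that idea (not the authors' pipeline, which used wavelet coherence and generative modeling) is to locate the peak of the cross-correlation between the two signals; all signal names, sampling rates, and parameters below are illustrative assumptions:

```python
import numpy as np

def estimate_lag(leader, follower, fs):
    """Return the delay (in seconds) of `follower` relative to `leader`,
    taken as the lag that maximizes their full cross-correlation."""
    leader = leader - leader.mean()
    follower = follower - follower.mean()
    xcorr = np.correlate(follower, leader, mode="full")
    lags = np.arange(-len(leader) + 1, len(follower))  # lags in samples
    return lags[np.argmax(xcorr)] / fs

# Synthetic demo: the "follower" reproduces the "leader's" low-frequency
# head motion (0.2-1.1 Hz band, as in the abstract) with a 600 ms delay.
fs = 100.0                    # motion-capture sampling rate (assumed)
t = np.arange(0, 30, 1 / fs)  # 30 s of simulated conversation
leader = np.sin(2 * np.pi * 0.3 * t) + 0.5 * np.sin(2 * np.pi * 0.9 * t)
follower = np.roll(leader, int(0.6 * fs))  # circular shift by 600 ms

print(estimate_lag(leader, follower, fs))  # 0.6
```

Unlike this time-domain sketch, the wavelet-coherence approach reported in the paper additionally resolves where in the frequency spectrum (and when in the conversation) the coupling occurs, which is what exposes the high-frequency hypo-coherence.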


Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e87e/7054373/16d329ab6ce4/10919_2019_320_Fig1_HTML.jpg
