Suppr 超能文献


Natural language processing models reveal neural dynamics of human conversation.

Authors

Cai Jing, Hadjinicolaou Alex E, Paulk Angelique C, Soper Daniel J, Xia Tian, Williams Ziv M, Cash Sydney S

Affiliations

Department of Neurosurgery, Massachusetts General Hospital, Harvard Medical School, Boston, MA.

Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA.

Publication

bioRxiv. 2024 Apr 18:2023.03.10.531095. doi: 10.1101/2023.03.10.531095.

DOI: 10.1101/2023.03.10.531095
PMID: 36945468
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10028965/
Abstract

Through conversation, humans relay complex information through the alternation of speech production and comprehension. The neural mechanisms that underlie these complementary processes or through which information is precisely conveyed by language, however, remain poorly understood. Here, we used pretrained deep learning natural language processing models in combination with intracranial neuronal recordings to discover neural signals that reliably reflect speech production, comprehension, and their transitions during natural conversation between individuals. Our findings indicate that neural activities that encoded linguistic information were broadly distributed throughout frontotemporal areas across multiple frequency bands. We also find that these activities were specific to the words and sentences being conveyed and that they were dependent on the word's specific context and order. Finally, we demonstrate that these neural patterns partially overlapped during language production and comprehension and that listener-speaker transitions were associated with specific, time-aligned changes in neural activity. Collectively, our findings reveal a dynamical organization of neural activities that subserve language production and comprehension during natural conversation and harness the use of deep learning models in understanding the neural mechanisms underlying human language.
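The abstract describes relating pretrained language-model representations to intracranial recordings. A common way to formalize this kind of analysis is a linear encoding model: contextual word embeddings are regressed against per-channel neural activity, and held-out predictive accuracy indicates whether the recordings carry linguistic information. The sketch below illustrates that general recipe only; it is not the authors' pipeline, and all dimensions and data are synthetic stand-ins.

```python
import numpy as np

# Generic encoding-model sketch (not the paper's actual method):
# regress neural activity onto word embeddings via ridge regression.
rng = np.random.default_rng(0)
n_words, emb_dim, n_channels = 200, 16, 8

# X: one language-model embedding per word token (synthetic here).
X = rng.normal(size=(n_words, emb_dim))

# A hidden linear map plus noise stands in for recorded activity Y.
W_true = rng.normal(size=(emb_dim, n_channels))
Y = X @ W_true + 0.1 * rng.normal(size=(n_words, n_channels))

# Closed-form ridge solution: W = (X'X + lam*I)^-1 X'Y.
lam = 1.0
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(emb_dim), X.T @ Y)

# If the embeddings encode what drives the activity, predictions
# correlate strongly with the observed signals.
Y_pred = X @ W_hat
r = np.corrcoef(Y_pred.ravel(), Y.ravel())[0, 1]
print(round(r, 3))
```

In practice such analyses fit per channel and frequency band with cross-validated regularization; the single closed-form fit above is just the minimal version of the idea.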


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2560/11042608/ceeaf7116c3c/nihpp-2023.03.10.531095v2-f0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2560/11042608/a6a3f8ed31cd/nihpp-2023.03.10.531095v2-f0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2560/11042608/7133964c7a4b/nihpp-2023.03.10.531095v2-f0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2560/11042608/bc748ab8ed96/nihpp-2023.03.10.531095v2-f0004.jpg

Similar articles

1. Natural language processing models reveal neural dynamics of human conversation.
   bioRxiv. 2024 Apr 18:2023.03.10.531095. doi: 10.1101/2023.03.10.531095.
2. Natural language processing models reveal neural dynamics of human conversation.
   Nat Commun. 2025 Apr 9;16(1):3376. doi: 10.1038/s41467-025-58620-w.
3. Linguistic processing of task-irrelevant speech at a cocktail party.
   Elife. 2021 May 4;10:e65096. doi: 10.7554/eLife.65096.
4. Neural Signatures of Hierarchical Linguistic Structures in Second Language Listening Comprehension.
   eNeuro. 2023 Jun 26;10(6). doi: 10.1523/ENEURO.0346-22.2023. Print 2023 Jun.
5. Deep Artificial Neural Networks Reveal a Distributed Cortical Network Encoding Propositional Sentence-Level Meaning.
   J Neurosci. 2021 May 5;41(18):4100-4119. doi: 10.1523/JNEUROSCI.1152-20.2021. Epub 2021 Mar 22.
6. Models and Approaches for Comprehension of Dysarthric Speech Using Natural Language Processing: Systematic Review.
   JMIR Rehabil Assist Technol. 2023 Oct 27;10:e44489. doi: 10.2196/44489.
7. Dimensionality and Ramping: Signatures of Sentence Integration in the Dynamics of Brains and Deep Language Models.
   J Neurosci. 2023 Jul 19;43(29):5350-5364. doi: 10.1523/JNEUROSCI.1163-22.2023. Epub 2023 May 22.
8. EEG-based speaker-listener neural coupling reflects speech-selective attentional mechanisms beyond the speech stimulus.
   Cereb Cortex. 2023 Nov 4;33(22):11080-11091. doi: 10.1093/cercor/bhad347.
9. Statistical learning beyond words in human neonates.
   Elife. 2025 Feb 17;13:RP101802. doi: 10.7554/eLife.101802.
10. Deep-learning models reveal how context and listener attention shape electrophysiological correlates of speech-to-language transformation.
    PLoS Comput Biol. 2024 Nov 11;20(11):e1012537. doi: 10.1371/journal.pcbi.1012537. eCollection 2024 Nov.

References cited in this article

1. A hierarchy of linguistic predictions during natural language comprehension.
   Proc Natl Acad Sci U S A. 2022 Aug 9;119(32):e2201968119. doi: 10.1073/pnas.2201968119. Epub 2022 Aug 3.
2. An investigation across 45 languages and 12 language families reveals a universal language network.
   Nat Neurosci. 2022 Aug;25(8):1014-1019. doi: 10.1038/s41593-022-01114-5. Epub 2022 Jul 18.
3. Neural dynamics differentially encode phrases and sentences during spoken language comprehension.
   PLoS Biol. 2022 Jul 14;20(7):e3001713. doi: 10.1371/journal.pbio.3001713. eCollection 2022 Jul.
4. Shared computational principles for language processing in humans and deep language models.
   Nat Neurosci. 2022 Mar;25(3):369-380. doi: 10.1038/s41593-022-01026-4. Epub 2022 Mar 7.
5. A speech planning network for interactive language use.
   Nature. 2022 Feb;602(7895):117-122. doi: 10.1038/s41586-021-04270-z. Epub 2022 Jan 5.
6. The neural architecture of language: Integrative modeling converges on predictive processing.
   Proc Natl Acad Sci U S A. 2021 Nov 9;118(45). doi: 10.1073/pnas.2105646118.
7. FAD-BERT: Improved prediction of FAD binding sites using pre-training of deep bidirectional transformers.
   Comput Biol Med. 2021 Apr;131:104258. doi: 10.1016/j.compbiomed.2021.104258. Epub 2021 Feb 8.
8. Lossy-Context Surprisal: An Information-Theoretic Model of Memory Effects in Sentence Processing.
   Cogn Sci. 2020 Mar;44(3):e12814. doi: 10.1111/cogs.12814.
9. Motor cortical control of vocal interaction in neotropical singing mice.
   Science. 2019 Mar 1;363(6430):983-988. doi: 10.1126/science.aau9480.
10. Neural encoding and production of functional morphemes in the posterior temporal lobe.
    Nat Commun. 2018 May 14;9(1):1877. doi: 10.1038/s41467-018-04235-3.