


Emergence of hierarchical structure mirroring linguistic composition in a recurrent neural network.

Affiliations

Graduate School of Informatics, Kyoto University, Engineering Building #10, Sakyo, Kyoto, 606-8501, Japan.

Publication information

Neural Netw. 2011 May;24(4):311-20. doi: 10.1016/j.neunet.2010.12.006. Epub 2011 Jan 12.

DOI: 10.1016/j.neunet.2010.12.006
PMID: 21273043
Abstract

We show that a Multiple Timescale Recurrent Neural Network (MTRNN) can acquire the capabilities to recognize, generate, and correct sentences by self-organizing in a way that mirrors the hierarchical structure of sentences: characters grouped into words, and words into sentences. The model can control which sentence to generate depending on its initial states (generation phase) and the initial states can be calculated from the target sentence (recognition phase). In an experiment, we trained our model over a set of unannotated sentences from an artificial language, represented as sequences of characters. Once trained, the model could recognize and generate grammatical sentences, even if they were not learned. Moreover, we found that our model could correct a few substitution errors in a sentence, and the correction performance was improved by adding the errors to the training sentences in each training iteration with a certain probability. An analysis of the neural activations in our model revealed that the MTRNN had self-organized, reflecting the hierarchical linguistic structure by taking advantage of the differences in timescale among its neurons: in particular, neurons that change the fastest represented "characters", those that change more slowly, "words", and those that change the slowest, "sentences".

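The timescale mechanism described in the abstract can be sketched as a leaky-integrator update, in which each unit's time constant τ sets how quickly its internal state tracks its input: small τ gives fast units ("characters"), large τ gives slow units ("words", "sentences"). A minimal sketch, assuming illustrative network sizes, random weights, and three τ levels (none of these are the paper's actual parameters):

```python
import numpy as np

def mtrnn_step(u, tau, W, x):
    """One leaky-integrator RNN update with per-unit time constants.

    u   : (N,) internal states
    tau : (N,) time constants; small tau -> fast units, large tau -> slow units
    W   : (N, N+D) combined recurrent and input weights
    x   : (D,) external input at this step
    """
    y = np.tanh(u)                              # unit activations
    z = np.concatenate([y, x])                  # recurrent + input drive
    return (1.0 - 1.0 / tau) * u + (W @ z) / tau

# Toy network: 4 fast, 3 medium, 2 slow units (sizes chosen for illustration).
rng = np.random.default_rng(0)
tau = np.array([2.0] * 4 + [10.0] * 3 + [50.0] * 2)
N, D = tau.size, 5
W = rng.normal(scale=0.1, size=(N, N + D))
u = np.zeros(N)
for t in range(20):
    u = mtrnn_step(u, tau, W, rng.normal(size=D))
```

With this update, a unit with τ = 50 blends only 1/50 of its new drive into its state per step, so the slow units integrate over many characters while the fast units follow the input closely, which is the timescale separation the paper's analysis attributes the hierarchical self-organization to.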

Similar articles

1. Emergence of hierarchical structure mirroring linguistic composition in a recurrent neural network.
   Neural Netw. 2011 May;24(4):311-20. doi: 10.1016/j.neunet.2010.12.006. Epub 2011 Jan 12.
2. Neural network processing of natural language: II. Towards a unified model of corticostriatal function in learning sentence comprehension and non-linguistic sequencing.
   Brain Lang. 2009 May-Jun;109(2-3):80-92. doi: 10.1016/j.bandl.2008.08.002. Epub 2008 Oct 5.
3. Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures.
   Neural Netw. 2007 Mar;20(2):236-44. doi: 10.1016/j.neunet.2006.01.020. Epub 2006 May 9.
4. Insensitivity of the human sentence-processing system to hierarchical structure.
   Psychol Sci. 2011 Jun;22(6):829-34. doi: 10.1177/0956797611409589. Epub 2011 May 17.
5. Activations of "motor" and other non-language structures during sentence comprehension.
   Brain Lang. 2004 May;89(2):290-9. doi: 10.1016/S0093-934X(03)00359-6.
6. Did you get the beat? Late proficient French-German learners extract strong-weak patterns in tonal but not in linguistic sequences.
   Neuroimage. 2011 Jan 1;54(1):568-76. doi: 10.1016/j.neuroimage.2010.07.062. Epub 2010 Aug 6.
7. Causal Bayesian network for tagging syntactical structure of Croatian sentences.
   Coll Antropol. 2005 Dec;29(2):731-3.
8. Semantic integration processes at different levels of syntactic hierarchy during sentence comprehension: an ERP study.
   Neuropsychologia. 2010 May;48(6):1551-62. doi: 10.1016/j.neuropsychologia.2010.02.001. Epub 2010 Feb 6.
9. Large-scale neural network for sentence processing.
   Brain Lang. 2006 Jan;96(1):14-36. doi: 10.1016/j.bandl.2005.07.072. Epub 2005 Sep 15.
10. Syntactic unification operations are reflected in oscillatory dynamics during on-line sentence comprehension.
    J Cogn Neurosci. 2010 Jul;22(7):1333-47. doi: 10.1162/jocn.2009.21283.

Cited by

1. Crossmodal Language Grounding in an Embodied Neurocognitive Model.
   Front Neurorobot. 2020 Oct 14;14:52. doi: 10.3389/fnbot.2020.00052. eCollection 2020.
2. Representation Learning of Logic Words by an RNN: From Word Sequences to Robot Actions.
   Front Neurorobot. 2017 Dec 22;11:70. doi: 10.3389/fnbot.2017.00070. eCollection 2017.
3. Dynamical Integration of Language and Behavior in a Recurrent Neural Network for Human-Robot Interaction.
   Front Neurorobot. 2016 Jul 15;10:5. doi: 10.3389/fnbot.2016.00005. eCollection 2016.