Simple recurrent networks learn context-free and context-sensitive languages by counting.

Author

Rodriguez P

Affiliation

Department of Cognitive Science, University of California at San Diego, La Jolla, CA 92093, USA.

Publication

Neural Comput. 2001 Sep;13(9):2093-118. doi: 10.1162/089976601750399326.

DOI: 10.1162/089976601750399326
PMID: 11516359
Abstract

It has been shown that if a recurrent neural network (RNN) learns to process a regular language, one can extract a finite-state machine (FSM) by treating regions of phase-space as FSM states. However, it has also been shown that one can construct an RNN to implement Turing machines by using RNN dynamics as counters. But how does a network learn languages that require counting? Rodriguez, Wiles, and Elman (1999) showed that a simple recurrent network (SRN) can learn to process a simple context-free language (CFL) by counting up and down. This article extends that to show a range of language tasks in which an SRN develops solutions that not only count but also copy and store counting information. In one case, the network stores information like an explicit storage mechanism. In other cases, the network stores information more indirectly in trajectories that are sensitive to slight displacements that depend on context. In this sense, an SRN can learn analog computation as a set of interdependent counters. This demonstrates how SRNs may be an alternative psychological model of language or sequence processing.
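The "counting up and down" solution described in the abstract can be made concrete with a hand-written sketch. The function below is an explicit counter analogue of the dynamics the trained SRN discovers for the simple context-free language aⁿbⁿ; it is an illustration of the counting strategy, not the network itself, and the function name and accept criterion (n ≥ 1, alphabet {a, b}) are assumptions for this sketch.

```python
def accepts_anbn(s: str) -> bool:
    """Recognize a^n b^n (n >= 1) with a single counter: count up on
    each 'a', down on each 'b'. A hand-wired analogue of the counting
    solution an SRN learns for this context-free language."""
    count = 0
    seen_b = False
    for ch in s:
        if ch == 'a':
            if seen_b:              # an 'a' after a 'b' breaks the a^n b^n shape
                return False
            count += 1              # count up
        elif ch == 'b':
            seen_b = True
            count -= 1              # count down
            if count < 0:           # more b's than a's so far
                return False
        else:
            return False            # symbol outside the alphabet
    return seen_b and count == 0    # equal counts, at least one of each
```

In the trained network, the role of `count` is played by the position of the hidden-state trajectory in phase space rather than an explicit integer; the context-sensitive tasks in the article extend this idea to several interdependent counters.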


Similar Articles

1. Simple recurrent networks learn context-free and context-sensitive languages by counting. Neural Comput. 2001 Sep;13(9):2093-118. doi: 10.1162/089976601750399326.
2. Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures. Neural Netw. 2007 Mar;20(2):236-44. doi: 10.1016/j.neunet.2006.01.020. Epub 2006 May 9.
3. Incremental training of first order recurrent neural networks to predict a context-sensitive language. Neural Netw. 2003 Sep;16(7):955-72. doi: 10.1016/S0893-6080(03)00054-6.
4. Learning grammatical structure with Echo State Networks. Neural Netw. 2007 Apr;20(3):424-32. doi: 10.1016/j.neunet.2007.04.013. Epub 2007 May 3.
5. Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Comput. 2002 Nov;14(11):2531-60. doi: 10.1162/089976602760407955.
6. On the emergence of rules in neural networks. Neural Comput. 2002 Sep;14(9):2245-68. doi: 10.1162/089976602320264079.
7. Elman topology with sigma-pi units: an application to the modeling of verbal hallucinations in schizophrenia. Neural Netw. 2005 Sep;18(7):863-77. doi: 10.1016/j.neunet.2005.03.009.
8. Elman backpropagation as reinforcement for simple recurrent networks. Neural Comput. 2007 Nov;19(11):3108-31. doi: 10.1162/neco.2007.19.11.3108.
9. Network capacity analysis for latent attractor computation. Network. 2003 May;14(2):273-302.
10. Learning nonregular languages: a comparison of simple recurrent networks and LSTM. Neural Comput. 2002 Sep;14(9):2039-41. doi: 10.1162/089976602320263980.

Cited By

1. Reservoir computing with random and optimized time-shifts. Chaos. 2021 Dec;31(12):121103. doi: 10.1063/5.0068941.
2. Comparing feedforward and recurrent neural network architectures with human behavior in artificial grammar learning. Sci Rep. 2020 Dec 17;10(1):22172. doi: 10.1038/s41598-020-79127-y.
3. A neural network model for the orbitofrontal cortex and task space acquisition during reinforcement learning. PLoS Comput Biol. 2018 Jan 4;14(1):e1005925. doi: 10.1371/journal.pcbi.1005925. eCollection 2018 Jan.
4. Principles of structure building in music, language and animal song. Philos Trans R Soc Lond B Biol Sci. 2015 Mar 19;370(1664):20140097. doi: 10.1098/rstb.2014.0097.
5. Computational principles of syntax in the regions specialized for language: integrating theoretical linguistics and functional neuroimaging. Front Behav Neurosci. 2013 Dec 18;7:204. doi: 10.3389/fnbeh.2013.00204. eCollection 2013.
6. Lexical knowledge without a lexicon? Ment Lex. 2011;6(1):1-33. doi: 10.1075/ml.6.1.01elm.
7. Direct Associations or Internal Transformations? Exploring the Mechanisms Underlying Sequential Learning Behavior. Cogn Sci. 2010;34(1):10-50. doi: 10.1111/j.1551-6709.2009.01076.x.
8. A dynamical systems perspective on the relationship between symbolic and non-symbolic computation. Cogn Neurodyn. 2009 Dec;3(4):415-27. doi: 10.1007/s11571-009-9099-8. Epub 2009 Nov 7.
9. On the meaning of words and dinosaur bones: Lexical knowledge without a lexicon. Cogn Sci. 2009;33(4):547-582. doi: 10.1111/j.1551-6709.2009.01023.x.