
Stable encoding of finite-state machines in discrete-time recurrent neural nets with sigmoid units.

Authors

Carrasco R C, Forcada M L, Valdés-Muñoz M A, Neco R P

Affiliation

Departament de Llenguatges i Sistemes Informàtics, Universitat d'Alacant, E-03071 Alacant, Spain.

Publication

Neural Comput. 2000 Sep;12(9):2129-74. doi: 10.1162/089976600300015097.

DOI: 10.1162/089976600300015097
PMID: 10976142
Abstract

There has been a lot of interest in the use of discrete-time recurrent neural nets (DTRNN) to learn finite-state tasks, with interesting results regarding the induction of simple finite-state machines from input-output strings. Parallel work has studied the computational power of DTRNN in connection with finite-state computation. This article describes a simple strategy to devise stable encodings of finite-state machines in computationally capable discrete-time recurrent neural architectures with sigmoid units and gives a detailed presentation on how this strategy may be applied to encode a general class of finite-state machines in a variety of commonly used first- and second-order recurrent neural networks. Unlike previous work that either imposed some restrictions to state values or used a detailed analysis based on fixed-point attractors, our approach applies to any positive, bounded, strictly growing, continuous activation function and uses simple bounding criteria based on a study of the conditions under which a proposed encoding scheme guarantees that the DTRNN is actually behaving as a finite-state machine.

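The encoding idea the abstract describes can be illustrated with a toy example. The sketch below is not the paper's actual construction; the one-hot second-order weight scheme, the gain `H`, and the bias of 0.5 are illustrative assumptions. It hardwires a two-state parity DFA into a second-order sigmoid DTRNN and checks that, with a sufficiently large gain, the saturated units track the DFA state exactly:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Parity DFA over {'0', '1'}: state 0 = even number of 1s seen, state 1 = odd.
delta = {(0, '0'): 0, (0, '1'): 1, (1, '0'): 1, (1, '1'): 0}
states, symbols = [0, 1], ['0', '1']

H = 16.0  # gain (illustrative); large enough to keep the units saturated
# Second-order weights: W[i][j][k] = 1 iff delta(q_j, s_k) = q_i, else 0
W = [[[1.0 if delta[(j, s)] == i else 0.0 for s in symbols]
      for j in states] for i in states]

def step(x, sym):
    """One DTRNN update: x'_i = sigmoid(H * (sum_jk W[i][j][k] x_j u_k - 0.5))."""
    k = symbols.index(sym)
    return [sigmoid(H * (sum(W[i][j][k] * x[j] for j in states) - 0.5))
            for i in states]

def run(string):
    """Run the net on a string; return the index of the winning state unit."""
    x = [1.0, 0.0]  # one-hot encoding of the DFA start state 0
    q = 0
    for sym in string:
        x = step(x, sym)
        q = delta[(q, sym)]
        assert max(x) > 0.9 and min(x) < 0.1  # units stay saturated
        assert x.index(max(x)) == q           # net tracks the DFA state
    return x.index(max(x))
```

Because the winning unit stays above σ(H·(σ(H/2) − 0.5)) ≈ 1 at this gain, the perturbation introduced at each step never accumulates, which is the informal meaning of a "stable" encoding here; the paper derives the precise bounding criteria for general activation functions.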
Similar Articles

1. Stable encoding of finite-state machines in discrete-time recurrent neural nets with sigmoid units.
Neural Comput. 2000 Sep;12(9):2129-74. doi: 10.1162/089976600300015097.
2. Discontinuities in recurrent neural networks.
Neural Comput. 1999 Apr 1;11(3):715-46. doi: 10.1162/089976699300016638.
3. Real-time computing without stable states: a new framework for neural computation based on perturbations.
Neural Comput. 2002 Nov;14(11):2531-60. doi: 10.1162/089976602760407955.
4. General-purpose computation with neural networks: a survey of complexity theoretic results.
Neural Comput. 2003 Dec;15(12):2727-78. doi: 10.1162/089976603322518731.
5. The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction.
Neural Comput. 1996 Aug 15;8(6):1135-78. doi: 10.1162/neco.1996.8.6.1135.
6. Partially pre-calculated weights for the backpropagation learning regime and high accuracy function mapping using continuous input RAM-based sigma-pi nets.
Neural Netw. 2000 Jan;13(1):91-110. doi: 10.1016/s0893-6080(99)00102-1.
7. Attractive periodic sets in discrete-time recurrent networks (with emphasis on fixed-point stability and bifurcations in two-neuron networks).
Neural Comput. 2001 Jun;13(6):1379-414. doi: 10.1162/08997660152002898.
8. Dynamic On-line Clustering and State Extraction: An Approach to Symbolic Learning.
Neural Netw. 1998 Jan;11(1):53-64. doi: 10.1016/s0893-6080(97)00113-5.
9. Time-delay neural networks: representation and induction of finite-state machines.
IEEE Trans Neural Netw. 1997;8(5):1065-70. doi: 10.1109/72.623208.
10. Continuous-time symmetric Hopfield nets are computationally universal.
Neural Comput. 2003 Mar;15(3):693-733. doi: 10.1162/089976603321192130.