

On-line learning algorithms for locally recurrent neural networks.

Authors

Campolucci P, Uncini A, Piazza F, Rao B D

Affiliation

Dipartimento di Elettronica ed Automatica, Università di Ancona, Ancona, Italy.

Publication

IEEE Trans Neural Netw. 1999;10(2):253-71. doi: 10.1109/72.750549.

DOI: 10.1109/72.750549
PMID: 18252525
Abstract

This paper focuses on on-line learning procedures for locally recurrent neural networks, with emphasis on the multilayer perceptron (MLP) with infinite impulse response (IIR) synapses and its variations, which include generalized output and activation feedback multilayer networks (MLNs). We propose a new gradient-based procedure called recursive backpropagation (RBP) whose on-line version, causal recursive backpropagation (CRBP), presents some advantages with respect to the other on-line training methods. The new CRBP algorithm includes as particular cases backpropagation (BP), temporal backpropagation (TBP), backpropagation for sequences (BPS), and the Back-Tsoi algorithm, among others, thereby providing a unifying view of gradient calculation techniques for recurrent networks with local feedback. The only learning method previously proposed for locally recurrent networks with no architectural restriction is the one by Back and Tsoi. The proposed algorithm has better stability and a higher speed of convergence than the Back-Tsoi algorithm, which is supported by the theoretical development and confirmed by simulations. The computational complexity of CRBP is comparable with that of the Back-Tsoi algorithm, e.g., less than a factor of 1.5 for usual architectures and parameter settings. The superior performance of the new algorithm, however, easily justifies this small increase in computational burden. In addition, the general paradigms of truncated BPTT and RTRL are applied to networks with local feedback and compared with the new CRBP method. The simulations show that CRBP exhibits similar performance, and the detailed analysis of complexity reveals that CRBP is much simpler and easier to implement, e.g., CRBP is local in space and in time while RTRL is not local in space.
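For intuition, a locally recurrent (IIR) synapse replaces the scalar weight of an MLP connection with a small linear filter, so feedback exists only inside each connection rather than across the network. The sketch below is illustrative only, not the authors' implementation: the coefficient names `b` (feedforward taps) and `a` (feedback taps) are assumptions, and the gradient function shows the standard forward-in-time sensitivity recursion from adaptive IIR filtering, which is the kind of on-line gradient computation the paper's algorithms build on, not the CRBP algorithm itself.

```python
import numpy as np

def iir_synapse(x, b, a):
    """Forward pass of one IIR synapse (a locally recurrent connection).

    y[t] = sum_k b[k] * x[t-k]  +  sum_m a[m] * y[t-1-m]

    In an IIR-MLP each connection applies such a filter before the
    neuron's nonlinearity, so the network has local feedback only.
    """
    y = np.zeros(len(x))
    for t in range(len(x)):
        ff = sum(b[k] * x[t - k] for k in range(len(b)) if t - k >= 0)
        fb = sum(a[m] * y[t - 1 - m] for m in range(len(a)) if t - 1 - m >= 0)
        y[t] = ff + fb
    return y

def iir_synapse_grad_b(x, b, a, k):
    """Sensitivity dy[t]/db[k], computed forward in time (hence usable
    on-line): it obeys the same recursion as the output itself,

    dy[t]/db[k] = x[t-k] + sum_m a[m] * dy[t-1-m]/db[k]
    """
    g = np.zeros(len(x))
    for t in range(len(x)):
        drive = x[t - k] if t - k >= 0 else 0.0
        fb = sum(a[m] * g[t - 1 - m] for m in range(len(a)) if t - 1 - m >= 0)
        g[t] = drive + fb
    return g

# A single feedback tap a = [0.5] on an impulse input decays
# geometrically: 1, 0.5, 0.25, 0.125, ...
print(iir_synapse(np.array([1.0, 0.0, 0.0, 0.0]), b=[1.0], a=[0.5]))
```

Because both recursions run causally (each step needs only past values of the same connection), the gradient computation is local in space and in time, which is the property the abstract highlights for CRBP in contrast to RTRL.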


Similar Articles

1. On-line learning algorithms for locally recurrent neural networks. IEEE Trans Neural Netw. 1999;10(2):253-71. doi: 10.1109/72.750549.
2. Robust adaptive gradient-descent training algorithm for recurrent neural networks in discrete time domain. IEEE Trans Neural Netw. 2008 Nov;19(11):1841-53. doi: 10.1109/TNN.2008.2001923.
3. New learning automata based algorithms for adaptation of backpropagation algorithm parameters. Int J Neural Syst. 2002 Feb;12(1):45-67. doi: 10.1142/S012906570200090X.
4. Novel maximum-margin training algorithms for supervised neural networks. IEEE Trans Neural Netw. 2010 Jun;21(6):972-84. doi: 10.1109/TNN.2010.2046423. Epub 2010 Apr 19.
5. Parameter incremental learning algorithm for neural networks. IEEE Trans Neural Netw. 2006 Nov;17(6):1424-38. doi: 10.1109/TNN.2006.880581.
6. Comments on "Backpropagation algorithms for a broad class of dynamic networks". IEEE Trans Neural Netw. 2009 Mar;20(3):540-1. doi: 10.1109/TNN.2009.2013243. Epub 2009 Feb 10.
7. Backpropagation algorithms for a broad class of dynamic networks. IEEE Trans Neural Netw. 2007 Jan;18(1):14-27. doi: 10.1109/TNN.2006.882371.
8. Gradient calculations for dynamic recurrent neural networks: a survey. IEEE Trans Neural Netw. 1995;6(5):1212-28. doi: 10.1109/72.410363.
9. An accelerated learning algorithm for multilayer perceptron networks. IEEE Trans Neural Netw. 1994;5(3):493-7. doi: 10.1109/72.286921.
10. On adaptive learning rate that guarantees convergence in feedforward networks. IEEE Trans Neural Netw. 2006 Sep;17(5):1116-25. doi: 10.1109/TNN.2006.878121.

Cited By

1. Development of a Neural Network for Target Gas Detection in Interdigitated Electrode Sensor-Based E-Nose Systems. Sensors (Basel). 2024 Aug 16;24(16):5315. doi: 10.3390/s24165315.
2. Artificial neural network model of the mapping between electromyographic activation and trajectory patterns in free-arm movements. Med Biol Eng Comput. 2003 Mar;41(2):124-32. doi: 10.1007/BF02344879.