Suppr 超能文献




A local training-pruning approach for recurrent neural networks.

Authors

Leung Chi-Sing, Lam Ping-Man

Affiliation

The City University of Hong Kong, Kowloon Tong, Hong Kong, China.

Publication

Int J Neural Syst. 2003 Feb;13(1):25-38. doi: 10.1142/S0129065703001376.

DOI: 10.1142/S0129065703001376
PMID: 12638121
Abstract

The global extended Kalman filtering (EKF) algorithm for recurrent neural networks (RNNs) suffers from high computational cost and storage requirements. In this paper, we present a local EKF training-pruning approach that solves this problem. In particular, the by-products obtained along with the local EKF training can be used to measure the importance of the network weights. Compared with the original global approach, the proposed local approach has a much lower computational cost and storage requirement, and is therefore more practical for real-world problems. Simulations showed that our approach is an effective joint training-pruning method for RNNs under online operation.
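The "by-product" the abstract refers to is the EKF error covariance matrix P, which is maintained during training anyway and whose diagonal indicates how uncertain each weight estimate is. A minimal sketch of the idea, under assumptions not taken from the paper: a single linear neuron stands in for a local RNN weight group, and the saliency measure `w_i^2 / P_ii` (an EKF-flavored importance score in the spirit of covariance-based pruning) is illustrative; the function and variable names here are hypothetical, not the authors' code.

```python
import numpy as np

# Hedged sketch: local EKF weight estimation for one weight group, then
# pruning from the EKF's own by-product (the error covariance P).
# A linear neuron y = w.x stands in for an RNN group purely for illustration.

rng = np.random.default_rng(0)
n = 6                                 # weights in this local group
w_true = np.array([2.0, -1.5, 0.0, 0.5, 0.0, 3.0])

w = np.zeros(n)                       # weight estimate (the EKF "state")
P = np.eye(n) * 10.0                  # error covariance; shrinks with training
R = 0.01                              # measurement-noise variance

def ekf_step(w, P, x, d, R):
    """One local EKF update; for a linear neuron the Jacobian H is just x."""
    H = x
    y = w @ x
    S = H @ P @ H + R                 # innovation variance (scalar here)
    K = P @ H / S                     # Kalman gain
    w = w + K * (d - y)               # correct the weight estimate
    P = P - np.outer(K, H @ P)        # shrink the covariance
    return w, P

for _ in range(200):
    x = rng.normal(size=n)
    d = w_true @ x + rng.normal(scale=np.sqrt(R))
    w, P = ekf_step(w, P, x, d, R)

# Pruning from the EKF by-product: weights that are small relative to their
# remaining estimation uncertainty score low and are candidates for removal.
saliency = w**2 / np.diag(P)
prune_mask = saliency < np.median(saliency)
print("estimated w:", np.round(w, 2))
print("pruned idx :", np.where(prune_mask)[0])
```

Because P is updated locally per weight group rather than globally over all network weights, both the storage (one small P per group) and the per-step cost stay low, which is the practical advantage the abstract claims.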


Similar articles

1. A local training-pruning approach for recurrent neural networks.
   Int J Neural Syst. 2003 Feb;13(1):25-38. doi: 10.1142/S0129065703001376.
2. A local training and pruning approach for neural networks.
   Int J Neural Syst. 2000 Dec;10(6):425-38. doi: 10.1142/S0129065700000430.
3. Dual extended Kalman filtering in recurrent neural networks.
   Neural Netw. 2003 Mar;16(2):223-39. doi: 10.1016/s0893-6080(02)00230-7.
4. Multiobjective hybrid optimization and training of recurrent neural networks.
   IEEE Trans Syst Man Cybern B Cybern. 2008 Apr;38(2):381-403. doi: 10.1109/TSMCB.2007.912937.
5. Recursive Bayesian recurrent neural networks for time-series modeling.
   IEEE Trans Neural Netw. 2010 Feb;21(2):262-74. doi: 10.1109/TNN.2009.2036174. Epub 2009 Dec 28.
6. Decision feedback recurrent neural equalization with fast convergence rate.
   IEEE Trans Neural Netw. 2005 May;16(3):699-708. doi: 10.1109/TNN.2005.845142.
7. Memory-efficient fully coupled filtering approach for observational model building.
   IEEE Trans Neural Netw. 2010 Apr;21(4):680-6. doi: 10.1109/TNN.2010.2041067. Epub 2010 Feb 25.
8. An adaptive Bayesian pruning for neural networks in a non-stationary environment.
   Neural Comput. 1999 May 15;11(4):965-76. doi: 10.1162/089976699300016539.
9. Extended Kalman Filter-Based Pruning Method for Recurrent Neural Networks.
   Neural Comput. 1998 Jul 28;10(6):1481-505. doi: 10.1162/089976698300017278.
10. Robust initialization of a Jordan network with recurrent constrained learning.
    IEEE Trans Neural Netw. 2011 Dec;22(12):2460-73. doi: 10.1109/TNN.2011.2168423. Epub 2011 Sep 29.