
DRRNets: Dynamic Recurrent Routing via Low-Rank Regularization in Recurrent Neural Networks.

Authors

Shan Dongjing, Luo Yong, Zhang Xiongwei, Zhang Chao

Publication

IEEE Trans Neural Netw Learn Syst. 2023 Apr;34(4):2057-2067. doi: 10.1109/TNNLS.2021.3105818. Epub 2023 Apr 4.

DOI: 10.1109/TNNLS.2021.3105818
PMID: 34460403
Abstract

Recurrent neural networks (RNNs) continue to show outstanding performance in sequence learning tasks such as language modeling, but it remains difficult to train RNNs on long sequences. The main challenges lie in complex dependencies, vanishing or exploding gradients, and the low resource budgets of model deployment. To address these challenges, we propose dynamic recurrent routing neural networks (DRRNets), which can: 1) shorten recurrent lengths by dynamically allocating recurrent routes to different dependencies and 2) significantly reduce the number of parameters by imposing low-rank constraints on the fully connected layers. A novel optimization algorithm based on low-rank constraints and sparsity projection is developed to train the network. We verify the effectiveness of the proposed method by comparing it with multiple competitive approaches on several popular sequence learning tasks, such as language modeling and speaker recognition. Results across different criteria demonstrate the superiority of the proposed method.
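
The abstract names two mechanisms: dynamic allocation of recurrent routes and low-rank constraints on the fully connected layers, trained with a sparsity-projection step. The sketch below illustrates only the low-rank idea under stated assumptions: a hypothetical LowRankRNNCell stores the hidden-to-hidden matrix as a factorization U·Vᵀ of rank r, and a hypothetical project_sparsity_ applies a simple hard-threshold projection after each optimizer step. The class and function names, and the particular projection rule, are illustrative assumptions, not the authors' published algorithm.

```python
# Minimal sketch, assuming a low-rank factorization of the recurrent
# weights -- NOT the authors' implementation of DRRNets.
import torch
import torch.nn as nn


class LowRankRNNCell(nn.Module):
    """Vanilla RNN cell whose hidden-to-hidden weights are rank-limited.

    Hypothetical: W_hh is stored as U @ V.T with rank <= r, so the
    recurrent matrix holds 2*n*r parameters instead of n*n.
    """

    def __init__(self, input_size: int, hidden_size: int, rank: int):
        super().__init__()
        self.W_in = nn.Linear(input_size, hidden_size)
        self.U = nn.Parameter(0.1 * torch.randn(hidden_size, rank))
        self.V = nn.Parameter(0.1 * torch.randn(hidden_size, rank))

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # (h @ V) has shape (batch, rank); multiplying by U.T restores
        # (batch, hidden) without materializing the full n x n matrix.
        return torch.tanh(self.W_in(x) + (h @ self.V) @ self.U.t())


@torch.no_grad()
def project_sparsity_(params, keep_ratio: float = 0.5):
    """Hard-threshold projection (an assumed stand-in for the paper's
    sparsity projection): zero all but the largest-magnitude entries."""
    for p in params:
        k = max(1, int(keep_ratio * p.numel()))
        # The k-th largest |value| is the (numel - k + 1)-th smallest.
        thresh = p.abs().flatten().kthvalue(p.numel() - k + 1).values
        p.mul_((p.abs() >= thresh).to(p.dtype))


# Usage: one projected-gradient step on random data.
cell = LowRankRNNCell(input_size=64, hidden_size=512, rank=32)
opt = torch.optim.SGD(cell.parameters(), lr=0.1)
x, h = torch.randn(8, 64), torch.zeros(8, 512)
loss = cell(x, h).pow(2).mean()
loss.backward()
opt.step()
project_sparsity_([cell.U, cell.V])  # project after the gradient step
```

With hidden_size = 512 and rank = 32, the recurrent matrix drops from 262,144 stored parameters to 32,768, which is the kind of reduction the low-rank constraint targets; the dynamic routing component is not sketched here because the abstract does not specify its mechanism.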


Similar Articles

1. DRRNets: Dynamic Recurrent Routing via Low-Rank Regularization in Recurrent Neural Networks.
   IEEE Trans Neural Netw Learn Syst. 2023 Apr;34(4):2057-2067. doi: 10.1109/TNNLS.2021.3105818. Epub 2023 Apr 4.
2. Deep Sparse Learning for Automatic Modulation Classification Using Recurrent Neural Networks.
   Sensors (Basel). 2021 Sep 25;21(19):6410. doi: 10.3390/s21196410.
3. SGORNN: Combining scalar gates and orthogonal constraints in recurrent networks.
   Neural Netw. 2023 Feb;159:25-33. doi: 10.1016/j.neunet.2022.11.028. Epub 2022 Nov 25.
4. Gating Revisited: Deep Multi-Layer RNNs That can be Trained.
   IEEE Trans Pattern Anal Mach Intell. 2022 Aug;44(8):4081-4092. doi: 10.1109/TPAMI.2021.3064878. Epub 2022 Jul 1.
5. Gated Orthogonal Recurrent Units: On Learning to Forget.
   Neural Comput. 2019 Apr;31(4):765-783. doi: 10.1162/neco_a_01174. Epub 2019 Feb 14.
6. Unconditional stability of a recurrent neural circuit implementing divisive normalization.
   ArXiv. 2025 Jan 15:arXiv:2409.18946v3.
7. Low-Rank Deep Convolutional Neural Network for Multitask Learning.
   Comput Intell Neurosci. 2019 May 20;2019:7410701. doi: 10.1155/2019/7410701. eCollection 2019.
8. Winning the Lottery With Neural Connectivity Constraints: Faster Learning Across Cognitive Tasks With Spatially Constrained Sparse RNNs.
   Neural Comput. 2023 Oct 10;35(11):1850-1869. doi: 10.1162/neco_a_01613.
9. Structured pruning of recurrent neural networks through neuron selection.
   Neural Netw. 2020 Mar;123:134-141. doi: 10.1016/j.neunet.2019.11.018. Epub 2019 Dec 5.
10. Recurrent Neural Networks With Auxiliary Memory Units.
    IEEE Trans Neural Netw Learn Syst. 2018 May;29(5):1652-1661. doi: 10.1109/TNNLS.2017.2677968. Epub 2017 Mar 21.

Cited By

1. Graph-based vision transformer with sparsity for training on small datasets from scratch.
   Sci Rep. 2025 Jul 8;15(1):24520. doi: 10.1038/s41598-025-10408-0.
2. Behavioral Classification of Sequential Neural Activity Using Time Varying Recurrent Neural Networks.
   IEEE Trans Neural Syst Rehabil Eng. 2025;33:2638-2649. doi: 10.1109/TNSRE.2025.3586175.