

Similar Articles

1. Deep Sparse Learning for Automatic Modulation Classification Using Recurrent Neural Networks.
   Sensors (Basel). 2021 Sep 25;21(19):6410. doi: 10.3390/s21196410.
2. Deep convolutional neural network and IoT technology for healthcare.
   Digit Health. 2024 Jan 17;10:20552076231220123. doi: 10.1177/20552076231220123. eCollection 2024 Jan-Dec.
3. Memory-Based Pruning of Deep Neural Networks for IoT Devices Applied to Flood Detection.
   Sensors (Basel). 2021 Nov 12;21(22):7506. doi: 10.3390/s21227506.
4. Transformed ℓ1 regularization for learning sparse deep neural networks.
   Neural Netw. 2019 Nov;119:286-298. doi: 10.1016/j.neunet.2019.08.015. Epub 2019 Aug 27.
5. An Ensemble Deep Learning Model for Automatic Modulation Classification in 5G and Beyond IoT Networks.
   Comput Intell Neurosci. 2021 Dec 14;2021:5047355. doi: 10.1155/2021/5047355. eCollection 2021.
6. Attentive transformer deep learning algorithm for intrusion detection on IoT systems using automatic Xplainable feature selection.
   PLoS One. 2023 Oct 16;18(10):e0286652. doi: 10.1371/journal.pone.0286652. eCollection 2023.
7. Biologically plausible deep learning - But how far can we go with shallow networks?
   Neural Netw. 2019 Oct;118:90-101. doi: 10.1016/j.neunet.2019.06.001. Epub 2019 Jun 20.
8. Structured pruning of recurrent neural networks through neuron selection.
   Neural Netw. 2020 Mar;123:134-141. doi: 10.1016/j.neunet.2019.11.018. Epub 2019 Dec 5.
9. EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks.
   Neural Netw. 2023 Jan;158:59-82. doi: 10.1016/j.neunet.2022.10.011. Epub 2022 Nov 4.
10. A Method of Deep Learning Model Optimization for Image Classification on Edge Device.
    Sensors (Basel). 2022 Sep 27;22(19):7344. doi: 10.3390/s22197344.


Deep Sparse Learning for Automatic Modulation Classification Using Recurrent Neural Networks

Affiliations

College of Biomedical Engineering and Instrument Science, Yuquan Campus, Zhejiang University, 38 Zheda Road, Hangzhou 310027, China.

Department of Biomedical Engineering, The Chinese University of Hong Kong, Shatin, New Territories, Hong Kong, China.

Publication Information

Sensors (Basel). 2021 Sep 25;21(19):6410. doi: 10.3390/s21196410.

DOI: 10.3390/s21196410
PMID: 34640730
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8512957/
Abstract

Deep learning models, especially recurrent neural networks (RNNs), have been successfully applied to automatic modulation classification (AMC) problems recently. However, deep neural networks are usually overparameterized, i.e., most of the connections between neurons are redundant. The large model size hinders the deployment of deep neural networks in applications such as Internet-of-Things (IoT) networks. Therefore, reducing parameters without compromising the network performance via sparse learning is often desirable, since it can alleviate the computational and storage burdens of deep learning models. In this paper, we propose a sparse learning algorithm that can directly train a sparsely connected neural network based on the statistics of weight magnitude and gradient momentum. We first used the MNIST and CIFAR10 datasets to demonstrate the effectiveness of this method. Subsequently, we applied it to RNNs with different pruning strategies on recurrent and non-recurrent connections for AMC problems. Experimental results demonstrated that the proposed method can effectively reduce the parameters of the neural networks while maintaining model performance. Moreover, we show that appropriate sparsity can further improve network generalization ability.
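The core idea in the abstract, namely ranking connections by statistics of weight magnitude and gradient momentum and keeping only the strongest, can be sketched in plain Python. This is an illustrative simplification, not the authors' actual algorithm: the scoring function (|weight| + |momentum|) and the fixed sparsity level are assumptions made for the example.

```python
# Illustrative sketch of magnitude-and-momentum-based pruning
# (assumed scoring rule, not the paper's exact method).

def prune_mask(weights, momentum, sparsity):
    """Return a 0/1 mask that zeroes out the `sparsity` fraction of
    connections with the smallest |weight| + |momentum| score."""
    scores = [abs(w) + abs(m) for w, m in zip(weights, momentum)]
    n_prune = int(len(weights) * sparsity)
    # Indices sorted by ascending importance score.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    pruned = set(order[:n_prune])
    return [0 if i in pruned else 1 for i in range(len(weights))]

# Toy example: 5 connections, prune the 40% with the weakest scores.
weights = [0.8, -0.05, 0.3, 0.01, -0.6]
momentum = [0.1, 0.02, 0.2, 0.00, 0.3]
print(prune_mask(weights, momentum, sparsity=0.4))  # → [1, 0, 1, 0, 1]
```

In practice the mask would be applied element-wise to the weight tensor at each training step, so pruned connections stay at zero while the surviving ones continue to be updated; the paper additionally distinguishes pruning strategies for recurrent versus non-recurrent connections in RNNs.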
