Spectral pruning of fully connected layers.

Affiliations

Instituto de Telecomunicações, Physics of Information and Quantum Technologies Group, Lisbon, Portugal.

CSDC, Department of Physics and Astronomy, University of Florence, Sesto Fiorentino, Italy.

Publication Information

Sci Rep. 2022 Jul 1;12(1):11201. doi: 10.1038/s41598-022-14805-7.

DOI: 10.1038/s41598-022-14805-7
PMID: 35778586
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9249877/
Abstract

Training of neural networks can be reformulated in spectral space by allowing the eigenvalues and eigenvectors of the network to act as the target of the optimization instead of the individual weights. Working in this setting, we show that the eigenvalues can be used to rank the nodes' importance within the ensemble. Indeed, we will prove that sorting the nodes based on their associated eigenvalues enables effective pre- and post-processing pruning strategies that yield massively compacted networks (in terms of the number of constituent neurons) with virtually unchanged performance. The proposed methods are tested on different architectures, with a single or multiple hidden layers, and against distinct classification tasks of general interest.
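The spectral formulation makes pruning concrete by expressing each layer's weights through per-node eigenvalues λ and eigenvector entries φ, so that a node's eigenvalue acts as a shared scale factor on all of its connections; ranking nodes by |λ| and slicing away the weakest rows and columns is then the pruning step. Below is a minimal PyTorch sketch of this idea, assuming the w[j,k] = φ[j,k](λ_in[k] − λ_out[j]) parametrization from the companion work "Machine learning in spectral domain" cited under References; SpectralLinear, prune_hidden_nodes, and all sizes are illustrative names and values, not the authors' released code.

```python
# Illustrative sketch only: a spectrally parametrized linear layer and
# eigenvalue-ranked node pruning, following our reading of the abstract.
import torch
import torch.nn as nn

class SpectralLinear(nn.Module):
    """Linear layer whose weight from input node k to output node j is
    w[j, k] = phi[j, k] * (lam_in[k] - lam_out[j]). The per-node
    eigenvalues lam_* scale every connection of their node, so |lam|
    doubles as a node-importance score."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.phi = nn.Parameter(0.01 * torch.randn(out_features, in_features))
        self.lam_in = nn.Parameter(torch.randn(in_features))
        self.lam_out = nn.Parameter(torch.randn(out_features))

    def weight(self) -> torch.Tensor:
        # Broadcast (1, in) - (out, 1) -> (out, in) elementwise factor.
        return self.phi * (self.lam_in.unsqueeze(0) - self.lam_out.unsqueeze(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.weight().t()

def prune_hidden_nodes(l1: SpectralLinear, l2: SpectralLinear, keep: int):
    """Keep the `keep` hidden nodes with the largest |eigenvalue|,
    slicing the matching rows of l1 and columns of l2. In this sketch
    the hidden nodes' eigenvalues live in l1.lam_out; l2.lam_in holds a
    second per-node set and is sliced consistently."""
    idx = l1.lam_out.detach().abs().argsort(descending=True)[:keep]
    p1 = SpectralLinear(l1.phi.shape[1], keep)
    p2 = SpectralLinear(keep, l2.phi.shape[0])
    with torch.no_grad():
        p1.phi.copy_(l1.phi[idx])
        p1.lam_in.copy_(l1.lam_in)
        p1.lam_out.copy_(l1.lam_out[idx])
        p2.phi.copy_(l2.phi[:, idx])
        p2.lam_in.copy_(l2.lam_in[idx])
        p2.lam_out.copy_(l2.lam_out)
    return p1, p2

# Toy usage: a 784-100-10 classifier shrunk to 30 hidden neurons.
l1, l2 = SpectralLinear(784, 100), SpectralLinear(100, 10)
p1, p2 = prune_hidden_nodes(l1, l2, keep=30)
logits = p2(torch.relu(p1(torch.randn(8, 784))))
print(logits.shape)  # torch.Size([8, 10])
```

In the cited companion papers, training the eigenvalues alone (with φ frozen at its random initialization) already yields workable accuracy, which is what makes |λ| a credible ranking; after pruning, a short fine-tuning pass can recover any residual loss.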

Figures

Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d564/9249877/6d31db1fc231/41598_2022_14805_Fig1_HTML.jpg
Fig. 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d564/9249877/0601b68054da/41598_2022_14805_Fig2_HTML.jpg
Fig. 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d564/9249877/3a20a3aa992b/41598_2022_14805_Fig3_HTML.jpg
Fig. 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d564/9249877/325365532128/41598_2022_14805_Fig4_HTML.jpg
Fig. 5: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d564/9249877/865bbd8314c8/41598_2022_14805_Fig5_HTML.jpg

Similar Articles

1. Spectral pruning of fully connected layers. Sci Rep. 2022 Jul 1;12(1):11201. doi: 10.1038/s41598-022-14805-7.
2. Training of sparse and dense deep neural networks: Fewer parameters, same performance. Phys Rev E. 2021 Nov;104(5-1):054312. doi: 10.1103/PhysRevE.104.054312.
3. Weak sub-network pruning for strong and efficient neural networks. Neural Netw. 2021 Dec;144:614-626. doi: 10.1016/j.neunet.2021.09.015. Epub 2021 Sep 30.
4. Task-specific feature extraction and classification of fMRI volumes using a deep neural network initialized with a deep belief network: Evaluation using sensorimotor tasks. Neuroimage. 2017 Jan 15;145(Pt B):314-328. doi: 10.1016/j.neuroimage.2016.04.003. Epub 2016 Apr 11.
5. One-Shot Neural Architecture Search by Dynamically Pruning Supernet in Hierarchical Order. Int J Neural Syst. 2021 Jul;31(7):2150029. doi: 10.1142/S0129065721500295. Epub 2021 Jun 14.
6. Pruning recurrent neural networks for improved generalization performance. IEEE Trans Neural Netw. 1994;5(5):848-51. doi: 10.1109/72.317740.
7. Deep neural network with weight sparsity control and pre-training extracts hierarchical features and enhances classification performance: Evidence from whole-brain resting-state functional connectivity patterns of schizophrenia. Neuroimage. 2016 Jan 1;124(Pt A):127-146. doi: 10.1016/j.neuroimage.2015.05.018. Epub 2015 May 15.
8. The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules. PLoS Comput Biol. 2021 Oct 11;17(10):e1009458. doi: 10.1371/journal.pcbi.1009458. eCollection 2021 Oct.
9. Small Network for Lightweight Task in Computer Vision: A Pruning Method Based on Feature Representation. Comput Intell Neurosci. 2021 Apr 17;2021:5531023. doi: 10.1155/2021/5531023. eCollection 2021.
10. Structured pruning of recurrent neural networks through neuron selection. Neural Netw. 2020 Mar;123:134-141. doi: 10.1016/j.neunet.2019.11.018. Epub 2019 Dec 5.

Cited By

1. Optimized ensemble deep learning for predictive analysis of student achievement. PLoS One. 2024 Aug 26;19(8):e0309141. doi: 10.1371/journal.pone.0309141. eCollection 2024.
2. A geometric approach for accelerating neural networks designed for classification problems. Sci Rep. 2024 Jul 30;14(1):17590. doi: 10.1038/s41598-024-68172-6.
3. Stochastic Gradient Descent-like relaxation is equivalent to Metropolis dynamics in discrete optimization and inference problems. Sci Rep. 2024 May 21;14(1):11638. doi: 10.1038/s41598-024-62625-8.
4. Fault Diagnosis of the Autonomous Driving Perception System Based on Information Fusion. Sensors (Basel). 2023 May 26;23(11):5110. doi: 10.3390/s23115110.

References

1. Training of sparse and dense deep neural networks: Fewer parameters, same performance. Phys Rev E. 2021 Nov;104(5-1):054312. doi: 10.1103/PhysRevE.104.054312.
2. Machine learning in spectral domain. Nat Commun. 2021 Feb 26;12(1):1330. doi: 10.1038/s41467-021-21481-0.
3. A mechanism for homeostatic plasticity. Nat Neurosci. 2004 Jul;7(7):691-2. doi: 10.1038/nn0704-691.