
Spectral pruning of fully connected layers.

Affiliations

Instituto de Telecomunicações, Physics of Information and Quantum Technologies Group, Lisbon, Portugal.

CSDC, Department of Physics and Astronomy, University of Florence, Sesto Fiorentino, Italy.

Publication information

Sci Rep. 2022 Jul 1;12(1):11201. doi: 10.1038/s41598-022-14805-7.

Abstract

Training of neural networks can be reformulated in spectral space by allowing the eigenvalues and eigenvectors of the network to act as targets of the optimization, instead of the individual weights. Working in this setting, we show that the eigenvalues can be used to rank the nodes' importance within the ensemble. Indeed, we prove that sorting the nodes based on their associated eigenvalues enables effective pre- and post-training pruning strategies, yielding massively compacted networks (in terms of the number of composing neurons) with virtually unchanged performance. The proposed methods are tested on different architectures, with a single or multiple hidden layers, and on distinct classification tasks of general interest.
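The core idea of the abstract — ranking hidden nodes by the magnitude of their associated eigenvalues and discarding the lowest-ranked ones — can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the eigenvalue array, the `keep_fraction` parameter, and the function name are hypothetical, and how each node acquires an eigenvalue depends on the paper's spectral parametrization of the layer.

```python
import numpy as np

def prune_by_eigenvalue(eigvals, keep_fraction=0.2):
    """Rank hidden nodes by |eigenvalue| and return the sorted
    indices of the nodes to keep (largest magnitudes first)."""
    k = max(1, int(len(eigvals) * keep_fraction))
    order = np.argsort(-np.abs(eigvals))  # descending by magnitude
    return np.sort(order[:k])

# hypothetical eigenvalues attached to 10 hidden nodes
eigvals = np.array([0.05, 1.2, -0.8, 0.01, 2.5, -0.02, 0.9, -1.5, 0.1, 0.3])
kept = prune_by_eigenvalue(eigvals, keep_fraction=0.3)
```

In a post-training variant, the kept indices would be used to slice the corresponding rows and columns of the layer's weight matrices before fine-tuning or direct evaluation.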


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d564/9249877/6d31db1fc231/41598_2022_14805_Fig1_HTML.jpg
