Instituto de Telecomunicações, Physics of Information and Quantum Technologies Group, Lisbon, Portugal.
CSDC, Department of Physics and Astronomy, University of Florence, Sesto Fiorentino, Italy.
Sci Rep. 2022 Jul 1;12(1):11201. doi: 10.1038/s41598-022-14805-7.
The training of neural networks can be reformulated in spectral space by letting the eigenvalues and eigenvectors of the network, rather than the individual weights, act as the target of the optimization. Working in this setting, we show that the eigenvalues can be used to rank the importance of the nodes within the ensemble. Indeed, we prove that sorting the nodes by their associated eigenvalues enables effective pre- and post-processing pruning strategies that yield massively compacted networks (in terms of the number of constituent neurons) with virtually unchanged performance. The proposed methods are tested on different architectures, with a single or multiple hidden layers, and against distinct classification tasks of general interest.
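The pruning idea described above can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: it assumes a single hidden layer whose nodes each carry one trained eigenvalue, ranks the nodes by eigenvalue magnitude, and keeps only the top-`keep` nodes by slicing the incoming and outgoing weight matrices accordingly. The function name `prune_by_eigenvalue` and the array shapes are illustrative assumptions.

```python
import numpy as np

def prune_by_eigenvalue(W_in, W_out, eigenvalues, keep):
    """Retain only the `keep` hidden nodes with largest |eigenvalue|.

    W_in:        (hidden, inputs)  weights into the hidden layer
    W_out:       (outputs, hidden) weights out of the hidden layer
    eigenvalues: (hidden,)         eigenvalue attached to each hidden node
    """
    # Rank nodes by the magnitude of their associated eigenvalue.
    order = np.argsort(-np.abs(eigenvalues))[:keep]
    # Slice both weight matrices so the surviving nodes stay consistent.
    return W_in[order, :], W_out[:, order], eigenvalues[order]

# Toy example: 4 hidden nodes, keep the 2 deemed most important.
rng = np.random.default_rng(0)
W_in = rng.standard_normal((4, 3))
W_out = rng.standard_normal((2, 4))
eig = np.array([0.1, -2.0, 0.5, 1.5])

W_in_p, W_out_p, eig_p = prune_by_eigenvalue(W_in, W_out, eig, keep=2)
print(W_in_p.shape, W_out_p.shape)  # (2, 3) (2, 2)
print(eig_p)                        # [-2.   1.5]
```

The same ranking can be applied before training (pre-processing, to fix a smaller architecture) or after training (post-processing, to compress a trained network), which is the distinction the abstract draws.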