Feature flow regularization: Improving structured sparsity in deep neural networks.

Authors

Wu Yue, Lan Yuan, Zhang Luchan, Xiang Yang

Affiliations

Department of Mathematics, The Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong.

College of Mathematics and Statistics, Shenzhen University, Shenzhen 518060, China.

Publication

Neural Netw. 2023 Apr;161:598-613. doi: 10.1016/j.neunet.2023.02.013. Epub 2023 Feb 13.

Abstract

Pruning is a model compression method that removes redundant parameters and accelerates inference in deep neural networks (DNNs) while maintaining accuracy. Most existing pruning methods impose conditions directly on parameters or features. In this paper, we propose a simple and effective regularization strategy that improves structured sparsity and structured pruning in DNNs from a new perspective: the evolution of features. In particular, we consider the trajectories connecting features of adjacent hidden layers, namely the feature flow. We propose feature flow regularization (FFR) to penalize the length and the total absolute curvature of these trajectories, which implicitly increases the structured sparsity of the parameters. The principle behind FFR is that short and straight trajectories lead to an efficient network without redundant parameters. Experiments on the CIFAR-10 and ImageNet datasets show that FFR improves structured sparsity and achieves pruning results comparable to, or even better than, those of state-of-the-art methods.
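The abstract does not spell out the penalty's exact functional form, so the following is only a minimal sketch of the idea in PyTorch, assuming a stack of equal-width hidden features so that adjacent-layer differences are well defined. First differences stand in for trajectory length and second differences serve as a discrete proxy for total absolute curvature; the function name feature_flow_penalty and the weights alpha and beta are illustrative, not the paper's notation.

```python
import torch

def feature_flow_penalty(features, alpha=1.0, beta=1.0):
    """Sketch of a length-plus-curvature penalty on hidden-feature trajectories.

    features: list of tensors [h_0, ..., h_L], each of shape (batch, d);
    equal widths are assumed purely for illustration. This is a hypothetical
    helper, not the paper's exact formulation.
    """
    length = features[0].new_zeros(())
    curvature = features[0].new_zeros(())
    # Trajectory length: distance between features of adjacent layers.
    for h_prev, h_next in zip(features[:-1], features[1:]):
        length = length + (h_next - h_prev).norm(dim=1).mean()
    # Discrete curvature proxy: change between consecutive segments.
    for h0, h1, h2 in zip(features[:-2], features[1:-1], features[2:]):
        curvature = curvature + ((h2 - h1) - (h1 - h0)).norm(dim=1).mean()
    return alpha * length + beta * curvature
```

In training, such a penalty would simply be added to the task loss, e.g. loss = criterion(logits, targets) + feature_flow_penalty(hidden_states); short, straight trajectories then discourage redundant parameters, which is what the abstract credits for the improved structured sparsity.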
