
A Knee-Guided Evolutionary Algorithm for Compressing Deep Neural Networks.

Publication information

IEEE Trans Cybern. 2021 Mar;51(3):1626-1638. doi: 10.1109/TCYB.2019.2928174. Epub 2021 Feb 17.

Abstract

Deep neural networks (DNNs) have been regarded as fundamental tools in many disciplines. At the same time, they are known for their large-scale parameters, highly redundant weights, and extensive computing-resource consumption, which poses a tremendous challenge to their deployment in real-time applications or on resource-constrained devices. To cope with this issue, compressing DNNs to accelerate their inference has drawn extensive interest recently. The basic idea is to prune parameters with little performance degradation. However, the overparameterized nature of DNNs and the conflict between parameter reduction and performance maintenance make it prohibitive to search the pruning parameter space manually. In this paper, we formally establish filter pruning as a multiobjective optimization problem and propose a knee-guided evolutionary algorithm (KGEA) that automatically searches for solutions with a quality tradeoff between the scale of parameters and performance, so that both conflicting objectives are optimized simultaneously. In particular, by incorporating a minimum Manhattan distance approach, the search effort in the proposed KGEA is explicitly guided toward the knee area, which greatly reduces the manual effort needed to find a good tradeoff solution. Moreover, parameter importance is estimated directly on the criterion of performance loss, which robustly identifies redundancy. In addition to the knee solution, a performance-improved model can also be found in a fine-tuning-free fashion. Experiments on compressing fully convolutional LeNet and VGG-19 networks validate the superiority of the proposed algorithm over state-of-the-art competing methods.
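The knee-selection idea the abstract describes (picking, from a two-objective Pareto front, the solution with minimum Manhattan distance to the ideal point) can be sketched as follows. This is a minimal illustration under the assumption of min-max normalized objectives; the function name and the example front are hypothetical, not the paper's implementation:

```python
def knee_by_min_manhattan(front):
    """Pick the knee solution from a 2-D Pareto front (both objectives
    minimized) as the point with minimum Manhattan distance to the ideal
    point, after min-max normalizing each objective. Returns the index."""
    f1 = [p[0] for p in front]
    f2 = [p[1] for p in front]
    lo1, hi1 = min(f1), max(f1)
    lo2, hi2 = min(f2), max(f2)

    def norm(v, lo, hi):
        # Degenerate axis (all values equal) contributes zero distance.
        return 0.0 if hi == lo else (v - lo) / (hi - lo)

    # Normalized ideal point is (0, 0); Manhattan distance is just the sum.
    dists = [norm(p[0], lo1, hi1) + norm(p[1], lo2, hi2) for p in front]
    return min(range(len(front)), key=dists.__getitem__)

# Hypothetical front: (fraction of parameters kept, error rate).
front = [(1.00, 0.08), (0.60, 0.09), (0.30, 0.10), (0.10, 0.25)]
print(front[knee_by_min_manhattan(front)])  # -> (0.3, 0.1)
```

The knee (0.30, 0.10) wins because it trades a large drop in parameters for only a small rise in error, which is exactly the region the KGEA biases its search toward.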

