E2FIF: Push the Limit of Binarized Deep Imagery Super-Resolution Using End-to-End Full-Precision Information Flow.

Authors

Song Chongxing, Lang Zhiqiang, Wei Wei, Zhang Lei

Publication Information

IEEE Trans Image Process. 2023;32:5379-5393. doi: 10.1109/TIP.2023.3315540. Epub 2023 Oct 5.

Abstract

Binary neural networks (BNNs) provide a promising way to deploy parameter-intensive deep single image super-resolution (SISR) models on real devices with limited storage and computational resources. To achieve performance comparable to their full-precision counterparts, most existing BNNs for SISR focus on compensating for the information loss caused by binarizing weights and activations, typically through better approximations to the binarized convolution. In this study, we revisit the difference between BNNs and their full-precision counterparts and argue that the key to good generalization in BNNs lies in preserving a complete full-precision information flow, together with an accurate gradient flow, through each binarized convolution layer. Motivated by this, we propose introducing a full-precision skip connection, or a variant thereof, over each binarized convolution layer across the entire network, which increases the forward expressive capability and the accuracy of the back-propagated gradient, thereby improving generalization. More importantly, this scheme can be applied to any existing BNN backbone for SISR without introducing any additional computation cost. To validate the proposed approach, we evaluate it with four different SISR backbones on four benchmark datasets and report clearly superior performance over existing BNNs and even some 4-bit competitors.
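
For readers wanting a concrete picture of the idea, below is a minimal PyTorch sketch (not the authors' released E2FIF code): a 3x3 convolution whose weights and activations are binarized with a sign function and a straight-through estimator (STE), wrapped by a full-precision identity skip connection so that real-valued features and gradients flow around every binarized layer. The class names, kernel size, and STE clipping rule are illustrative assumptions.

# Minimal sketch of a binarized conv block with a full-precision skip connection.
# Names and details are illustrative, not the paper's exact implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinarySign(torch.autograd.Function):
    """sign() in the forward pass; clipped identity (STE) in the backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients only where the input lies in [-1, 1].
        return grad_output * (x.abs() <= 1).float()


class BinarizedConvWithFPSkip(nn.Module):
    """Binarized 3x3 conv plus a full-precision skip connection (hypothetical layer)."""

    def __init__(self, channels):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(channels, channels, 3, 3) * 0.01)

    def forward(self, x):
        # Binarize activations and weights; the skip path below stays full precision.
        xb = BinarySign.apply(x)
        wb = BinarySign.apply(self.weight)
        out = F.conv2d(xb, wb, padding=1)
        # Full-precision information flow: add the unquantized input back in.
        return out + x


if __name__ == "__main__":
    layer = BinarizedConvWithFPSkip(channels=64)
    feats = torch.randn(1, 64, 32, 32)
    print(layer(feats).shape)  # torch.Size([1, 64, 32, 32])

Because the skip path is a plain addition of features that are already computed, it introduces no extra binary or full-precision multiply-accumulate operations, which is consistent with the abstract's claim of no additional computation cost.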
