BNET: Batch Normalization With Enhanced Linear Transformation.

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2023 Jul;45(7):9225-9232. doi: 10.1109/TPAMI.2023.3235369. Epub 2023 Jun 5.

DOI: 10.1109/TPAMI.2023.3235369
PMID: 37018583
Abstract

Batch normalization (BN) is a fundamental unit in modern deep neural networks. However, BN and its variants focus on normalization statistics but neglect the recovery step that uses linear transformation to improve the capacity of fitting complex data distributions. In this paper, we demonstrate that the recovery step can be improved by aggregating the neighborhood of each neuron rather than just considering a single neuron. Specifically, we propose a simple yet effective method named batch normalization with enhanced linear transformation (BNET) to embed spatial contextual information and improve representation ability. BNET can be easily implemented using the depth-wise convolution and seamlessly transplanted into existing architectures with BN. To our best knowledge, BNET is the first attempt to enhance the recovery step for BN. Furthermore, BN is interpreted as a special case of BNET from both spatial and spectral views. Experimental results demonstrate that BNET achieves consistent performance gains based on various backbones in a wide range of visual tasks. Moreover, BNET can accelerate the convergence of network training and enhance spatial information by assigning important neurons with large weights accordingly.
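The enhanced recovery step described in the abstract lends itself to a compact implementation. Below is a minimal PyTorch-style sketch of the idea, assuming a standard normalization without per-channel affine parameters followed by a depth-wise convolution that linearly combines each neuron's spatial neighborhood. The module name, the 3x3 kernel size, and the exact wiring are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of a BNET-style block (assumed structure, not the official
# implementation): normalize statistics, then apply a depth-wise convolution
# as the "enhanced linear transformation" over each neuron's neighborhood.
import torch
import torch.nn as nn

class BNETSketch(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # Normalization statistics only; the usual per-channel gamma/beta
        # affine parameters are disabled and replaced below.
        self.norm = nn.BatchNorm2d(channels, affine=False)
        # Depth-wise convolution: each channel is transformed by a small
        # spatial kernel, aggregating the neighborhood of every neuron.
        self.recover = nn.Conv2d(
            channels, channels, kernel_size,
            padding=kernel_size // 2, groups=channels, bias=True,
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.recover(self.norm(x))

if __name__ == "__main__":
    # Drop-in use on a feature map of shape (N, C, H, W).
    x = torch.randn(8, 64, 32, 32)
    out = BNETSketch(64)(x)
    print(out.shape)  # torch.Size([8, 64, 32, 32])
```

Note that with a 1x1 kernel the depth-wise convolution reduces to an ordinary per-channel scale and bias, consistent with the abstract's claim that BN can be viewed as a special case of BNET.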


Similar Articles

1. BNET: Batch Normalization With Enhanced Linear Transformation.
   IEEE Trans Pattern Anal Mach Intell. 2023 Jul;45(7):9225-9232. doi: 10.1109/TPAMI.2023.3235369. Epub 2023 Jun 5.
2. Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing.
   Neural Netw. 2021 Dec;144:686-698. doi: 10.1016/j.neunet.2021.09.022. Epub 2021 Oct 5.
3. Why Batch Normalization Damage Federated Learning on Non-IID Data?
   IEEE Trans Neural Netw Learn Syst. 2023 Nov 1;PP. doi: 10.1109/TNNLS.2023.3323302.
4. Training Faster by Separating Modes of Variation in Batch-Normalized Models.
   IEEE Trans Pattern Anal Mach Intell. 2020 Jun;42(6):1483-1500. doi: 10.1109/TPAMI.2019.2895781. Epub 2019 Jan 28.
5. Re-Thinking the Effectiveness of Batch Normalization and Beyond.
   IEEE Trans Pattern Anal Mach Intell. 2024 Jan;46(1):465-478. doi: 10.1109/TPAMI.2023.3319005. Epub 2023 Dec 5.
6. Diminishing Batch Normalization.
   IEEE Trans Neural Netw Learn Syst. 2024 May;35(5):6544-6557. doi: 10.1109/TNNLS.2022.3210840. Epub 2024 May 2.
7. L1-Norm Batch Normalization for Efficient Training of Deep Neural Networks.
   IEEE Trans Neural Netw Learn Syst. 2019 Jul;30(7):2043-2051. doi: 10.1109/TNNLS.2018.2876179. Epub 2018 Nov 9.
8. PMONN: an optical neural network for photonic integrated circuits based on micro-resonator.
   Opt Express. 2024 Feb 26;32(5):7832-7847. doi: 10.1364/OE.511245.
9. Effective and Efficient Batch Normalization Using a Few Uncorrelated Data for Statistics Estimation.
   IEEE Trans Neural Netw Learn Syst. 2021 Jan;32(1):348-362. doi: 10.1109/TNNLS.2020.2978753. Epub 2021 Jan 4.
10. Instance Segmentation Based on Improved Self-Adaptive Normalization.
    Sensors (Basel). 2022 Jun 10;22(12):4396. doi: 10.3390/s22124396.

Articles Citing This Paper

1. Deep reinforcement learning based low energy consumption scheduling approach design for urban electric logistics vehicle networks.
   Sci Rep. 2025 Mar 15;15(1):9003. doi: 10.1038/s41598-025-92916-7.
2. ASD-SWNet: a novel shared-weight feature extraction and classification network for autism spectrum disorder diagnosis.
   Sci Rep. 2024 Jun 13;14(1):13696. doi: 10.1038/s41598-024-64299-8.
3. MARR-GAN: Memristive Attention Recurrent Residual Generative Adversarial Network for Raindrop Removal.
   Micromachines (Basel). 2024 Jan 31;15(2):217. doi: 10.3390/mi15020217.