

Noise-boosted bidirectional backpropagation and adversarial learning.

Affiliations

Department of Electrical and Computer Engineering, Signal and Image Processing Institute, University of Southern California, Los Angeles, CA 90089-2564, USA.

Publication information

Neural Netw. 2019 Dec;120:9-31. doi: 10.1016/j.neunet.2019.09.016. Epub 2019 Oct 17.

DOI: 10.1016/j.neunet.2019.09.016
PMID: 31753425
Abstract

Bidirectional backpropagation trains a neural network with backpropagation in both the backward and forward directions using the same synaptic weights. Special injected noise can then improve the algorithm's training time and accuracy because backpropagation has a likelihood structure. Training in each direction is a form of generalized expectation-maximization because backpropagation itself is a form of generalized expectation-maximization. This requires backpropagation invariance in each direction: The gradient log-likelihood in each direction must give back the original update equations of the backpropagation algorithm. The special noise makes the current training signal more probable as bidirectional backpropagation climbs the nearest hill of joint probability or log-likelihood. The noise for injection differs for classification and regression even in the same network because of the constraint of backpropagation invariance. The backward pass in a bidirectionally trained classifier estimates the centroid of the input pattern class. So the feedback signal that arrives back at the input layer of a classifier tends to estimate the local pattern-class centroid. Simulations show that noise speeded convergence and improved the accuracy of bidirectional backpropagation on both the MNIST test set of hand-written digits and the CIFAR-10 test set of images. The noise boost further applies to regular and Wasserstein bidirectionally trained adversarial networks. Bidirectionality also greatly reduced the problem of mode collapse in regular adversarial networks.

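The abstract's core mechanism — running forward and backward training passes through the same weight matrix, with zero-mean noise injected into the training signal — can be sketched in a toy example. This is an illustrative simplification, not the authors' implementation: the noise below is plain zero-mean Gaussian rather than the paper's likelihood-conditioned NEM noise, and the network is a single layer rather than a deep one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian pattern classes in 2-D.
n = 200
X0 = rng.normal([-1.0, -1.0], 0.3, size=(n, 2))
X1 = rng.normal([1.0, 1.0], 0.3, size=(n, 2))
X = np.vstack([X0, X1])
Y = np.vstack([np.tile([1.0, 0.0], (n, 1)),   # one-hot class codes
               np.tile([0.0, 1.0], (n, 1))])

W = rng.normal(0, 0.1, size=(2, 2))  # shared synaptic weights, both directions
b = np.zeros(2)                      # forward bias (classifier output)
c = np.zeros(2)                      # backward bias (regression output)
lr, noise_scale = 0.1, 0.05

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for epoch in range(200):
    # Forward pass: classify x -> y under a cross-entropy likelihood.
    P = softmax(X @ W + b)
    # Inject small zero-mean noise into the target signal (a crude
    # stand-in for the paper's noise boost).
    Yn = Y + noise_scale * rng.normal(size=Y.shape)
    W -= lr * (X.T @ (P - Yn) / len(X))
    b -= lr * (P - Yn).mean(axis=0)

    # Backward pass: regress y -> x through the SAME weights, transposed.
    Xhat = Y @ W.T + c
    R = Xhat - X                          # squared-error residual
    W -= lr * (R.T @ Y / len(X))          # gradient w.r.t. W via W.T
    c -= lr * R.mean(axis=0)

# Per the abstract, the backward pass should map each class code toward
# the centroid of its input pattern class (here, near (-1, -1) for class 0).
centroid0 = np.array([1.0, 0.0]) @ W.T + c
```

The point of the sketch is the weight sharing: the classification gradient and the backward regression gradient both update the single matrix `W`, and the backward direction ends up estimating the local pattern-class centroid, as the abstract describes.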

Similar articles

1. Noise-boosted bidirectional backpropagation and adversarial learning.
   Neural Netw. 2019 Dec;120:9-31. doi: 10.1016/j.neunet.2019.09.016. Epub 2019 Oct 17.
2. Noise can speed backpropagation learning and deep bidirectional pretraining.
   Neural Netw. 2020 Sep;129:359-384. doi: 10.1016/j.neunet.2020.04.004. Epub 2020 Apr 11.
3. Noise-enhanced convolutional neural networks.
   Neural Netw. 2016 Jun;78:15-23. doi: 10.1016/j.neunet.2015.09.014. Epub 2015 Oct 19.
4. Biologically plausible deep learning - But how far can we go with shallow networks?
   Neural Netw. 2019 Oct;118:90-101. doi: 10.1016/j.neunet.2019.06.001. Epub 2019 Jun 20.
5. Hardware-Efficient On-line Learning through Pipelined Truncated-Error Backpropagation in Binary-State Networks.
   Front Neurosci. 2017 Sep 6;11:496. doi: 10.3389/fnins.2017.00496. eCollection 2017.
6. Training Robust Deep Neural Networks via Adversarial Noise Propagation.
   IEEE Trans Image Process. 2021;30:5769-5781. doi: 10.1109/TIP.2021.3082317.
7. Photons guided by axons may enable backpropagation-based learning in the brain.
   Sci Rep. 2022 Dec 1;12(1):20720. doi: 10.1038/s41598-022-24871-6.
8. Three learning phases for radial-basis-function networks.
   Neural Netw. 2001 May;14(4-5):439-58. doi: 10.1016/s0893-6080(01)00027-2.
9. Handwritten Digit Recognition Using Nearest-Neighbor, Radial-Basis Function, and Backpropagation Neural Networks.
   Neural Comput. 1991 Fall;3(3):440-449. doi: 10.1162/neco.1991.3.3.440.
10. Low-variance Forward Gradients using Direct Feedback Alignment and momentum.
    Neural Netw. 2024 Jan;169:572-583. doi: 10.1016/j.neunet.2023.10.051. Epub 2023 Nov 4.