Effect of Time-Varying Multiplicative Noise on DNN-kWTA Model.

Author Information

Lu Wenhao, Zheng Yuanjin, Leung Chi-Sing

Publication Information

IEEE Trans Neural Netw Learn Syst. 2024 Dec;35(12):18922-18930. doi: 10.1109/TNNLS.2023.3317135. Epub 2024 Dec 2.

DOI: 10.1109/TNNLS.2023.3317135
PMID: 37796669
Abstract

Among the many k-winners-take-all (kWTA) models, the dual neural network (DNN-kWTA) model has significantly fewer connections. However, in analog realizations, noise is inevitable and affects the operational correctness of the kWTA process. Most existing results focus on the effect of additive noise. This brief studies the effect of time-varying multiplicative input noise. Two scenarios are considered. The first is the bounded-noise case, in which only the noise range is known. The second is the general noise-distribution case, in which we either know the noise distribution or have noise samples. For each scenario, we first prove the convergence property of the DNN-kWTA model under multiplicative input noise and then provide an efficient method to determine whether a noise-affected DNN-kWTA network performs the correct kWTA process for a given set of inputs. With these two methods, we can efficiently measure the probability that the network performs the correct kWTA process. In addition, for the case of uniformly distributed inputs, we derive two closed-form expressions, one for each scenario, for estimating the probability of correct operation. Finally, we conduct simulations to verify our theoretical results.
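The abstract's central quantity is the probability that a noise-affected network still selects the true k winners. The paper's efficient methods and closed-form expressions are not given in the abstract, so the sketch below is a brute-force Monte Carlo illustration only, for bounded uniform multiplicative input noise; the helper names `kwta_winners` and `correct_kwta_probability` are hypothetical, not from the paper.

```python
import random

def kwta_winners(inputs, k):
    """Return the index set of the k largest inputs (the ideal kWTA outcome)."""
    order = sorted(range(len(inputs)), key=lambda i: inputs[i], reverse=True)
    return set(order[:k])

def correct_kwta_probability(inputs, k, noise_bound, trials=10000, seed=0):
    """Monte Carlo estimate of the probability that kWTA still picks the true
    k winners when each input u_i is perturbed to u_i * (1 + e_i), with e_i
    drawn uniformly from [-noise_bound, noise_bound] on each trial
    (time-varying bounded multiplicative noise)."""
    rng = random.Random(seed)
    ideal = kwta_winners(inputs, k)
    correct = sum(
        kwta_winners([u * (1 + rng.uniform(-noise_bound, noise_bound))
                      for u in inputs], k) == ideal
        for _ in range(trials)
    )
    return correct / trials
```

For well-separated positive inputs and a small noise bound the estimate stays near 1, and it degrades as the bound grows large enough for the k-th and (k+1)-th inputs to swap order; the paper's contribution is to obtain this probability efficiently, without such brute-force sampling.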


Similar Articles

1. Effect of Time-Varying Multiplicative Noise on DNN-kWTA Model.
IEEE Trans Neural Netw Learn Syst. 2024 Dec;35(12):18922-18930. doi: 10.1109/TNNLS.2023.3317135. Epub 2024 Dec 2.
2. DNN-kWTA With Bounded Random Offset Voltage Drifts in Threshold Logic Units.
IEEE Trans Neural Netw Learn Syst. 2022 Jul;33(7):3184-3192. doi: 10.1109/TNNLS.2021.3050493. Epub 2022 Jul 6.
3. Influence of Imperfections on the Operational Correctness of DNN-kWTA Model.
IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):15021-15029. doi: 10.1109/TNNLS.2023.3281523. Epub 2024 Oct 7.
4. Robustness Analysis on Dual Neural Network-Based kWTA With Input Noise.
IEEE Trans Neural Netw Learn Syst. 2018 Apr;29(4):1082-1094. doi: 10.1109/TNNLS.2016.2645602. Epub 2017 Feb 6.
5. Properties and Performance of Imperfect Dual Neural Network-Based kWTA Networks.
IEEE Trans Neural Netw Learn Syst. 2015 Sep;26(9):2188-2193. doi: 10.1109/TNNLS.2014.2358851. Epub 2014 Nov 3.
6. Analysis on the Convergence Time of Dual Neural Network-Based kWTA.
IEEE Trans Neural Netw Learn Syst. 2012 Apr;23(4):676-682. doi: 10.1109/TNNLS.2012.2186315.
7. On Wang kWTA With Input Noise, Output Node Stochastic, and Recurrent State Noise.
IEEE Trans Neural Netw Learn Syst. 2018 Sep;29(9):4212-4222. doi: 10.1109/TNNLS.2017.2759905. Epub 2017 Oct 27.
8. Initialization-Based k-Winners-Take-All Neural Network Model Using Modified Gradient Descent.
IEEE Trans Neural Netw Learn Syst. 2023 Aug;34(8):4130-4138. doi: 10.1109/TNNLS.2021.3123240. Epub 2023 Aug 4.
9. A Novel Recurrent Neural Network With One Neuron and Finite-Time Convergence for k-Winners-Take-All Operation.
IEEE Trans Neural Netw. 2010 Jul;21(7):1140-1148. doi: 10.1109/TNN.2010.2050781.
10. A General Mean-Based Iterative Winner-Take-All Neural Network.
IEEE Trans Neural Netw. 1995;6(1):14-24. doi: 10.1109/72.363454.