

Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes.

Affiliation

Centro de Investigación en Computación, Instituto Politécnico Nacional (CIC-IPN), Avenida Juan de Dios Bátiz, Esq. Miguel Othón de Mendizábal, Col. Nueva Industrial Vallejo, Delegación Gustavo A. Madero, 07738 Ciudad de México, Mexico.

Publication Information

Comput Intell Neurosci. 2016;2016:4642052. doi: 10.1155/2016/4642052. Epub 2016 Dec 13.

Abstract

The Artificial Neural Network (ANN) concept is common in methods whose task is, for example, the identification or approximation of the outputs of complex systems that are difficult to model. In general, the objective is to determine online the parameters that give a better point-to-point convergence rate. This paper therefore presents parameter estimation for an equivalent ANN (EANN), obtaining a recursive identification of a stochastic system, first with constant parameters and then under nonstationary output conditions. In the latter case the parameters themselves have stochastic properties, so traditional approximation methods are inadequate because they lose their convergence rate. To address this problem, we propose a nonconstant exponential forgetting factor (NCEFF) with sliding modes, obtaining exponential convergence at almost all points. Theoretical results for both identification stages are implemented in MATLAB® and compared, showing an improvement when the new proposal is applied under nonstationary output conditions.
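For a concrete reference point, the sketch below shows plain recursive least squares with an exponential forgetting factor, plus a generic error-driven rule for varying that factor online. It is a minimal Python illustration under assumed names (rls_forgetting, variable_lambda) and parameters; it does not reproduce the paper's EANN structure, its NCEFF law, or the sliding-mode term.

import numpy as np

def rls_forgetting(phi, y, theta, P, lam):
    # One step of recursive least squares with exponential forgetting factor lam (0 < lam <= 1).
    # phi: regressor column vector (n, 1); y: scalar measurement;
    # theta: current parameter estimate (n, 1); P: covariance matrix (n, n).
    denom = lam + (phi.T @ P @ phi).item()
    K = (P @ phi) / denom                       # gain vector
    err = y - (phi.T @ theta).item()            # a priori prediction error
    theta = theta + K * err                     # parameter update
    P = (P - K @ phi.T @ P) / lam               # covariance update
    return theta, P, err

def variable_lambda(err, lam_min=0.90, lam_max=0.999, sigma=1.0):
    # Illustrative non-constant forgetting factor: move toward lam_min when the
    # prediction error is large (track parameter changes faster) and toward
    # lam_max when it is small (average out noise). This heuristic is NOT the
    # paper's NCEFF/sliding-mode law; it only conveys the general idea.
    return lam_min + (lam_max - lam_min) * float(np.exp(-(err / sigma) ** 2))

# Minimal usage sketch: identify y_t = a*x_t + b + noise with an abrupt
# (nonstationary) change in (a, b) halfway through the run.
rng = np.random.default_rng(0)
theta, P, lam = np.zeros((2, 1)), 1e3 * np.eye(2), 0.99
true_theta = np.array([[2.0], [-1.0]])
for t in range(500):
    if t == 250:
        true_theta = np.array([[0.5], [3.0]])   # parameter jump
    phi = np.array([[rng.normal()], [1.0]])
    y = (phi.T @ true_theta).item() + 0.05 * rng.normal()
    theta, P, err = rls_forgetting(phi, y, theta, P, lam)
    lam = variable_lambda(err)                  # adapt the forgetting factor
print(theta.ravel())                            # estimate should track the post-change parameters

In the paper, the forgetting factor is instead driven by a sliding-mode term, which is what yields the exponential convergence stated in the abstract; the loop above only mimics the overall recursive-identification workflow.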


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38b9/5187595/d23900d55104/CIN2016-4642052.001.jpg

Similar Articles

2. Iterative Exponential Forgetting Factor for EEG Signals Parameter Estimation.
   Comput Intell Neurosci. 2018 Jan 15;2018:4613740. doi: 10.1155/2018/4613740. eCollection 2018.
3. Convergence analysis of online gradient method for BP neural networks.
   Neural Netw. 2011 Jan;24(1):91-8. doi: 10.1016/j.neunet.2010.09.007. Epub 2010 Sep 16.
4. Neural Net Gains Estimation Based on an Equivalent Model.
   Comput Intell Neurosci. 2016;2016:1690924. doi: 10.1155/2016/1690924. Epub 2016 Jun 5.
7. Stability of Cohen-Grossberg neural networks with time-varying delays.
   Neural Netw. 2007 Oct;20(8):868-73. doi: 10.1016/j.neunet.2007.07.005. Epub 2007 Jul 28.

Cited By

1. Iterative Exponential Forgetting Factor for EEG Signals Parameter Estimation.
   Comput Intell Neurosci. 2018 Jan 15;2018:4613740. doi: 10.1155/2018/4613740. eCollection 2018.

References

1. Neural Net Gains Estimation Based on an Equivalent Model.
   Comput Intell Neurosci. 2016;2016:1690924. doi: 10.1155/2016/1690924. Epub 2016 Jun 5.
