
Admissible Delay Upper Bounds for Global Asymptotic Stability of Neural Networks With Time-Varying Delays.

Author Information

Zhang Xian-Ming, Han Qing-Long, Wang Jun

Publication Information

IEEE Trans Neural Netw Learn Syst. 2018 Nov;29(11):5319-5329. doi: 10.1109/TNNLS.2018.2797279. Epub 2018 Feb 16.

Abstract

This paper is concerned with the global asymptotic stability of a neural network with a time-varying delay, where the delay function is differentiable and uniformly bounded, with its derivative bounded from above. First, a general reciprocally convex inequality is presented by introducing slack vectors of flexible dimensions. This inequality provides a tighter bound, in the form of a convex combination, than some existing ones. Second, by constructing a proper Lyapunov-Krasovskii functional, the global asymptotic stability of the neural network is analyzed for two types of time-varying delays, depending on whether or not the lower bound of the delay derivative is known. Third, noticing that the sufficient stability conditions obtained by estimating the derivative of the Lyapunov-Krasovskii functional are affine in both the delay function and its derivative, the allowable delay sets can be refined to produce less conservative stability criteria for the neural network under study. Finally, two numerical examples are given to substantiate the effectiveness of the proposed method.
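For readers unfamiliar with the tools named in the abstract, the LaTeX sketch below shows a typical neural network model with a time-varying delay and the standard reciprocally convex combination lemma (Park et al., 2011), which is the bound the paper's general inequality extends. The notation ($A$, $W_0$, $W_1$, $g$, $h(t)$, $S$) is illustrative background and is not taken from the paper itself.

```latex
% Illustrative background, not reproduced from the paper.
% A typical neural network with a time-varying delay h(t):
\begin{equation}
  \dot{x}(t) = -A\,x(t) + W_0\,g\bigl(x(t)\bigr)
             + W_1\,g\bigl(x(t - h(t))\bigr) + u,
  \qquad 0 \le h(t) \le \bar{h},\quad \dot{h}(t) \le \mu .
\end{equation}

% Standard reciprocally convex combination lemma (Park et al., 2011):
% for \alpha in (0,1), R_1 > 0, R_2 > 0, and any matrix S satisfying
% the coupling condition on the left, the \alpha-weighted sum is
% bounded from below by a quadratic form that no longer depends on \alpha.
\begin{equation}
  \begin{bmatrix} R_1 & S \\ S^{\top} & R_2 \end{bmatrix} \succeq 0
  \;\Longrightarrow\;
  \frac{1}{\alpha}\,x_1^{\top} R_1 x_1
  + \frac{1}{1-\alpha}\,x_2^{\top} R_2 x_2
  \;\ge\;
  \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}^{\top}
  \begin{bmatrix} R_1 & S \\ S^{\top} & R_2 \end{bmatrix}
  \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}.
\end{equation}
```

Per the abstract, the paper generalizes this bound by replacing the single slack matrix $S$ with slack vectors of flexible dimensions, which yields a tighter convex-combination bound.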

