Admissible Delay Upper Bounds for Global Asymptotic Stability of Neural Networks With Time-Varying Delays.

Author Information

Zhang Xian-Ming, Han Qing-Long, Wang Jun

Publication Information

IEEE Trans Neural Netw Learn Syst. 2018 Nov;29(11):5319-5329. doi: 10.1109/TNNLS.2018.2797279. Epub 2018 Feb 16.

Abstract

This paper is concerned with the global asymptotic stability of a neural network with a time-varying delay, where the delay function is differentiable and uniformly bounded, and its derivative is bounded from above. First, a general reciprocally convex inequality is presented by introducing slack vectors with flexible dimensions. This inequality provides a tighter bound, in the form of a convex combination, than some existing ones. Second, by constructing a proper Lyapunov-Krasovskii functional, the global asymptotic stability of the neural network is analyzed for two types of time-varying delays, depending on whether or not the lower bound of the delay derivative is known. Third, since the sufficient stability conditions obtained from estimating the derivative of the Lyapunov-Krasovskii functional are affine in both the delay function and its derivative, the allowable delay sets can be refined to produce less conservative stability criteria for the neural network under study. Finally, two numerical examples are given to substantiate the effectiveness of the proposed method.
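For context, work in this area typically studies the standard delayed-neural-network model and builds on the classical reciprocally convex lemma. The sketch below uses generic notation that is illustrative only and is not taken from the paper itself:

```latex
% Standard delayed neural network model (notation illustrative):
%   A diagonal positive definite, f(\cdot) sector-bounded activations,
%   d(t) the time-varying delay.
\dot{x}(t) = -A x(t) + W_0 f(x(t)) + W_1 f\bigl(x(t - d(t))\bigr) + b,
\qquad 0 \le d(t) \le h, \quad \mu_1 \le \dot{d}(t) \le \mu_2.

% Classical reciprocally convex lemma (Park et al., 2011), which the
% paper's "general reciprocally convex inequality" tightens: for
% R \succ 0, \alpha \in (0,1), and any S such that
% \begin{bmatrix} R & S \\ S^{\mathsf T} & R \end{bmatrix} \succeq 0,
\frac{1}{\alpha}\, x^{\mathsf T} R x
  + \frac{1}{1-\alpha}\, y^{\mathsf T} R y
\;\ge\;
\begin{bmatrix} x \\ y \end{bmatrix}^{\mathsf T}
\begin{bmatrix} R & S \\ S^{\mathsf T} & R \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}.
```

The delay-derivative bounds \(\mu_1, \mu_2\) correspond to the two cases the abstract distinguishes: when the lower bound \(\mu_1\) is known versus when only the upper bound \(\mu_2\) is available.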
