
Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays.

Author information

Cao Jinde, Wang Jun

Affiliations

Department of Mathematics, Southeast University, Nanjing 210096 Jiangsu, China.

Publication information

Neural Netw. 2004 Apr;17(3):379-90. doi: 10.1016/j.neunet.2003.08.007.

Abstract

This paper investigates the absolute exponential stability of a general class of delayed neural networks whose activation functions are required only to be partially Lipschitz continuous and monotone nondecreasing, and not necessarily differentiable or bounded. Using a delayed Halanay-type inequality and Lyapunov functions, three new sufficient conditions are derived to ascertain whether the equilibrium points of delayed neural networks with additively diagonally stable interconnection matrices are absolutely exponentially stable. The stability criteria also apply to delayed optimization neural networks and delayed cellular neural networks, whose activation functions are often nondifferentiable or unbounded. The results answer the question: if a neural network without delay is absolutely exponentially stable, under what additional conditions is the corresponding delayed neural network also absolutely exponentially stable?
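To make the setting concrete, here is a minimal simulation sketch of a *standard* delayed recurrent neural network of the form x'(t) = -Dx(t) + Ag(x(t)) + Bg(x(t-τ)) + u, integrated by forward Euler with a history buffer. This is an illustrative model of the class discussed, not the paper's exact system or stability test; the matrices, delay, and the ReLU activation (Lipschitz and monotone nondecreasing, but neither differentiable nor bounded, matching the abstract's assumptions) are all chosen for illustration.

```python
import numpy as np

def simulate_delayed_rnn(D, A, B, u, tau, T=50.0, dt=0.001, x0=None):
    """Forward-Euler simulation of x'(t) = -D x(t) + A g(x(t)) + B g(x(t-tau)) + u
    with constant initial history x(s) = x0 for s in [-tau, 0]."""
    n = len(u)
    steps = int(T / dt)
    delay_steps = int(tau / dt)
    # ReLU: Lipschitz and nondecreasing, but nondifferentiable and unbounded.
    g = lambda x: np.maximum(x, 0.0)
    x0 = np.ones(n) if x0 is None else np.asarray(x0, dtype=float)
    hist = [x0.copy() for _ in range(delay_steps + 1)]  # history buffer over [-tau, 0]
    x = x0.copy()
    for _ in range(steps):
        x_delayed = hist[0]                      # state tau seconds ago
        dx = -D @ x + A @ g(x) + B @ g(x_delayed) + u
        x = x + dt * dx
        hist.append(x.copy())
        hist.pop(0)
    return x

# Illustrative 2-neuron network: the decay rates in D dominate the
# interconnection matrices A and B, so trajectories from different
# initial histories settle to the same equilibrium.
D = np.diag([2.0, 2.0])
A = np.array([[0.1, -0.2], [0.2, 0.1]])
B = np.array([[0.1, 0.1], [-0.1, 0.1]])
u = np.array([1.0, 0.5])

x_final = simulate_delayed_rnn(D, A, B, u, tau=0.5)
x_final2 = simulate_delayed_rnn(D, A, B, u, tau=0.5, x0=[-3.0, 4.0])
print(np.allclose(x_final, x_final2, atol=1e-4))  # True: both reach the same equilibrium
```

Absolute exponential stability means this convergence to a unique equilibrium, at an exponential rate, holds for *every* activation function in the admissible class and every input u, which is why criteria on the interconnection matrices alone (such as additive diagonal stability) are valuable.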

