Cao Jinde, Wang Jun
Department of Mathematics, Southeast University, Nanjing 210096 Jiangsu, China.
Neural Netw. 2004 Apr;17(3):379-90. doi: 10.1016/j.neunet.2003.08.007.
This paper investigates the absolute exponential stability of a general class of delayed neural networks whose activation functions are required only to be partially Lipschitz continuous and monotone nondecreasing, but not necessarily differentiable or bounded. Using a delay Halanay-type inequality and a Lyapunov function, three new sufficient conditions are derived for ascertaining whether the equilibrium points of delayed neural networks with additively diagonally stable interconnection matrices are absolutely exponentially stable. The stability criteria also apply to delayed optimization neural networks and delayed cellular neural networks, whose activation functions are often nondifferentiable or unbounded. The results herein answer the question: if a neural network without delay is absolutely exponentially stable, under what additional conditions is the corresponding delayed neural network also absolutely exponentially stable?
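For concreteness, a minimal sketch of the delayed Hopfield-type model and the Halanay-type delay inequality typically used in this line of work; the notation below is illustrative and is not taken from the abstract:

  \dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}\, g_j(x_j(t - \tau_{ij})) + I_i, \quad i = 1, \dots, n,

with each activation g_j assumed partially Lipschitz continuous and monotone nondecreasing. A Halanay-type inequality then yields the exponential estimate: if D^{+} V(t) \le -\alpha V(t) + \beta \sup_{t-\tau \le s \le t} V(s) with \alpha > \beta > 0, then V(t) \le \big( \sup_{-\tau \le s \le 0} V(s) \big) e^{-\lambda t}, where \lambda > 0 is the unique root of \lambda = \alpha - \beta e^{\lambda \tau}.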