

Global exponential stability of recurrent neural networks for solving optimization and related problems.

Author information

Xia Y, Wang J

Affiliation

Department of Automation and Computer-Aided Engineering, The Chinese University of Hong Kong, Shatin, NT, Hong Kong.

Publication information

IEEE Trans Neural Netw. 2000;11(4):1017-22. doi: 10.1109/72.857782.

Abstract

Global exponential stability is a desirable property for dynamic systems. This paper studies the global exponential stability of several existing recurrent neural networks for solving linear programming problems, convex programming problems with interval constraints, convex programming problems with nonlinear constraints, and monotone variational inequalities. In contrast to the existing results on global exponential stability, the present results do not require additional conditions on the weight matrices of recurrent neural networks and improve some existing conditions for global exponential stability. Therefore, the stability results in this paper further demonstrate the superior convergence properties of the existing neural networks for optimization.
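To illustrate the kind of dynamics the paper analyzes, the following is a minimal sketch (not the paper's exact model) of a projection-type recurrent neural network for a monotone variational inequality over a box constraint, simulated with forward Euler integration. The matrix `M`, vector `q`, box bounds, and step sizes are made-up illustrative values; with `M` symmetric positive definite the mapping is strongly monotone, and the trajectory converges to the unique equilibrium, consistent with the exponential-convergence behavior the paper establishes for such networks.

```python
import numpy as np

def project_box(x, lo, hi):
    """Piecewise-linear activation: projection onto the box [lo, hi]."""
    return np.clip(x, lo, hi)

def simulate(M, q, lo, hi, x0, dt=0.01, steps=5000):
    """Forward-Euler simulation of dx/dt = P_Omega(x - (M x + q)) - x."""
    x = x0.astype(float)
    for _ in range(steps):
        x = x + dt * (project_box(x - (M @ x + q), lo, hi) - x)
    return x

# Illustrative data: symmetric positive-definite M => strongly monotone map.
M = np.array([[3.0, 1.0], [1.0, 2.0]])
q = np.array([-4.0, -3.0])
lo, hi = np.zeros(2), np.full(2, 10.0)

x_star = simulate(M, q, lo, hi, x0=np.array([5.0, 5.0]))

# At an equilibrium, x satisfies the fixed-point condition of the
# box-constrained variational inequality: x = P_Omega(x - (M x + q)).
residual = np.linalg.norm(project_box(x_star - (M @ x_star + q), lo, hi) - x_star)
```

For this data the equilibrium is the interior point solving `M x + q = 0`, namely `x = (1, 1)`, and `residual` is driven to numerical zero; when the unconstrained minimizer lies outside the box, the projection keeps the trajectory feasible and the network settles on the boundary instead.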

