Liang XB
Department of Electrical and Computer Engineering, University of Delaware, Newark, DE.
IEEE Trans Neural Netw. 2001;12(6):1521-5. doi: 10.1109/72.963790.
We investigate the qualitative properties of a recurrent neural network (RNN) for minimizing a nonlinear, continuously differentiable, and convex objective function over any given nonempty, closed, and convex subset, which may be bounded or unbounded, by exploiting some key inequalities in mathematical programming. The global existence and boundedness of the solution of the RNN are proved when the objective function is convex and has a nonempty constrained minimum set. Under the same assumption, the RNN is shown to be globally convergent in the sense that every trajectory of the RNN converges to some equilibrium point of the RNN. If the objective function itself is uniformly convex and its gradient is a locally Lipschitz continuous mapping, then the RNN is globally exponentially convergent in the sense that every trajectory converges exponentially to the unique equilibrium point of the RNN. These qualitative properties make the network model well suited for solving convex minimization problems over any given nonempty, closed, and convex subset, regardless of whether the constraint set is bounded.
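The abstract does not reproduce the explicit network equation, but RNNs for this problem class are commonly written as the projection-type dynamics dx/dt = -x + P_Ω(x - α∇f(x)), whose equilibria coincide with the constrained minimizers. The following is a minimal sketch, not the paper's stated model, that integrates such dynamics with forward Euler for a convex quadratic over a box constraint; the function names, step sizes, and the choice Ω = [0,1]^n are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical projection-type RNN dynamics (assumed form, not taken from the paper):
#   dx/dt = -x + P_Omega(x - alpha * grad_f(x))
# Example problem: minimize f(x) = 0.5 * ||x - c||^2 over the box Omega = [0, 1]^n.
# The constrained minimizer is then the Euclidean projection of c onto the box.

def grad_f(x, c):
    # Gradient of the convex quadratic f(x) = 0.5 * ||x - c||^2
    return x - c

def project_box(y, lo=0.0, hi=1.0):
    # Euclidean projection P_Omega onto the box [lo, hi]^n
    return np.clip(y, lo, hi)

def rnn_trajectory(x0, c, alpha=0.5, dt=0.05, steps=2000):
    # Forward-Euler integration of the assumed RNN dynamics
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * (project_box(x - alpha * grad_f(x, c)) - x)
    return x

c = np.array([1.5, -0.3, 0.4])   # unconstrained minimizer lies partly outside the box
x0 = np.zeros(3)
x_star = rnn_trajectory(x0, c)
print(x_star)                    # approx [1.0, 0.0, 0.4], the projection of c onto [0, 1]^3
```

For this strongly convex example the trajectory converges to the unique equilibrium, consistent with the exponential convergence regime described in the abstract; for merely convex objectives the abstract guarantees only convergence of every trajectory to some equilibrium point.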