

Training simultaneous recurrent neural network with resilient propagation for static optimization.

Author information

Gursel Serpen, Joel Corra

Affiliation

Electrical Engineering and Computer Science, The University of Toledo, Toledo, OH 43606, USA.

Publication information

Int J Neural Syst. 2002 Jun-Aug;12(3-4):203-18. doi: 10.1142/S0129065702001199.

Abstract

This paper proposes a non-recurrent training algorithm, resilient propagation, for the Simultaneous Recurrent Neural Network operating in relaxation mode to compute high-quality solutions of static optimization problems. Implementation details for adapting the recurrent neural network weights through the non-recurrent training algorithm, resilient backpropagation, are formulated through an algebraic approach. Performance of the proposed neuro-optimizer on a well-known static combinatorial optimization problem, the Traveling Salesman Problem, is evaluated on the basis of computational complexity measures and subsequently compared to the performance of the Simultaneous Recurrent Neural Network trained with standard backpropagation and with recurrent backpropagation for the same static optimization problem. Simulation results indicate that the Simultaneous Recurrent Neural Network trained with the resilient backpropagation algorithm is able to locate superior-quality solutions with a comparable amount of computational effort for the Traveling Salesman Problem.
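The resilient backpropagation (Rprop) algorithm named in the abstract adapts a separate step size for each weight and uses only the sign of the partial derivative, not its magnitude. The following is a minimal illustrative sketch of one common variant (the one that suppresses the weight update when the gradient sign flips), demonstrated on a toy quadratic rather than on the paper's Simultaneous Recurrent Neural Network; the parameter values (1.2, 0.5, step bounds) are the customary defaults from the Rprop literature, not values taken from this paper:

```python
import numpy as np

def rprop_update(w, grad, prev_grad, step,
                 eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=50.0):
    """One Rprop step: per-weight step sizes adapt on sign agreement."""
    sign_change = grad * prev_grad
    # Same gradient sign as last iteration: direction is consistent,
    # so grow that weight's step size (capped at step_max).
    step = np.where(sign_change > 0,
                    np.minimum(step * eta_plus, step_max), step)
    # Sign flip: the last step jumped over a minimum,
    # so shrink the step size (floored at step_min).
    step = np.where(sign_change < 0,
                    np.maximum(step * eta_minus, step_min), step)
    # On a sign flip, zero the gradient so this weight skips one update
    # and the next iteration's sign comparison restarts cleanly.
    grad = np.where(sign_change < 0, 0.0, grad)
    # Move each weight by its own step size, in the downhill direction.
    w = w - np.sign(grad) * step
    return w, grad, step

# Toy problem: minimize f(w) = sum(w**2), whose gradient is 2*w.
w = np.array([4.0, -3.0])
prev_grad = np.zeros_like(w)
step = np.full_like(w, 0.1)
for _ in range(100):
    grad = 2.0 * w
    w, prev_grad, step = rprop_update(w, grad, prev_grad, step)
```

Because only gradient signs are used, the method is insensitive to the scale of the error surface, which is one reason it is attractive for the stiff recurrent-network training problem the paper addresses.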
