Deterministic global optimization for FNN training.

Author information

Toh K A

Affiliation

Institute for Infocomm Research, Singapore.

Publication information

IEEE Trans Syst Man Cybern B Cybern. 2003;33(6):977-83. doi: 10.1109/TSMCB.2002.804366.

Abstract

This paper addresses the training of feedforward neural networks by global optimization. The main contributions are a characterization of the global optimality of a network error function and the formulation of a global descent algorithm for the network training problem. A network with a single hidden layer and a single output unit is considered. By means of a monotonic transformation, a sufficient condition for global optimality of a network error function is presented. Based on this condition, a penalty-based algorithm is derived that directs the search towards regions likely to contain the global minima. Numerical comparison on benchmark problems from the neural network literature shows that the proposed algorithm outperforms several local methods in the percentage of trials attaining the desired solutions. The algorithm is also shown to be effective on several pattern recognition problems.
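The paper's specific monotonic transformation and penalty formulation are not reproduced here, but the two-phase idea the abstract describes (run a local descent, then add a penalty that repels the search from the minimum just found and descend again) can be sketched as follows. The network size, XOR data, penalty form, and all hyperparameters below are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data for a single-hidden-layer, single-output network.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

NH, NI = 3, 2  # hidden units, inputs

def unpack(w):
    # Split the flat parameter vector into the two layers.
    W1 = w[:NH * NI].reshape(NH, NI)
    b1 = w[NH * NI:NH * NI + NH]
    W2 = w[NH * NI + NH:NH * NI + 2 * NH]
    b2 = w[-1]
    return W1, b1, W2, b2

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1.T + b1)  # hidden layer
    e = h @ W2 + b2 - y         # single linear output unit
    return 0.5 * np.mean(e ** 2)

def num_grad(f, w, eps=1e-6):
    # Central-difference gradient keeps the sketch dependency-free.
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w)
        d[i] = eps
        g[i] = (f(w + d) - f(w - d)) / (2 * eps)
    return g

def descend(f, w, lr=0.2, steps=2000):
    # Plain gradient descent, returning the best point seen.
    best, fbest = w, f(w)
    for _ in range(steps):
        w = w - lr * num_grad(f, w)
        fw = f(w)
        if fw < fbest:
            best, fbest = w, fw
    return best

n_par = NH * NI + NH + NH + 1
w0 = rng.normal(0.0, 0.5, n_par)

# Phase 1: local descent from a random start -> a possibly local minimum.
w_star = descend(mse, w0)

# Phase 2: repel the search from w_star with a penalty term, descend again
# from a fresh start, and keep whichever point has the lower raw error.
def penalised(w, mu=0.05):
    return mse(w) + mu / (1e-3 + np.sum((w - w_star) ** 2))

w_alt = descend(penalised, rng.normal(0.0, 0.5, n_par))
w_best = min((w_star, w_alt), key=mse)
print(f"MSE after phase 1: {mse(w_star):.4f}, best overall: {mse(w_best):.4f}")
```

The penalty here is an inverse-distance bump around the previously found minimum; the paper instead derives its penalty from a sufficient condition for global optimality, so this sketch only conveys the search-steering mechanism, not the theoretical guarantee.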

