
Leap-frog is a robust algorithm for training neural networks.

Author Information

Holm J E, Botha E C

Affiliation

Department of Electrical and Electronic Engineering, University of Pretoria, South Africa.

Publication Information

Network. 1999 Feb;10(1):1-13.

Abstract

Optimization of perceptron neural network classifiers requires an optimization algorithm that is robust. In general, the best network is selected after a number of optimization trials. An effective optimization algorithm generates good weight-vector solutions in a few optimization trial runs owing to its inherent ability to escape local minima, whereas a less effective algorithm requires a larger number of trial runs. Repetitive training and testing is a tedious process, so an effective algorithm is desirable to reduce training time and increase the quality of the set of available weight-vector solutions. We present leap-frog as a robust optimization algorithm for training neural networks. In this paper the dynamic principles of leap-frog are described together with experiments to show the ability of leap-frog to generate reliable weight-vector solutions. Performance histograms are used to compare leap-frog with a variable-metric method, a conjugate-gradient method with modified restarts, and a constrained-momentum-based algorithm. Results indicate that leap-frog performs better in terms of classification error than the other three algorithms on two distinctly different test problems.
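
The "dynamic principles" referred to in the abstract are those of dynamic-trajectory optimization: the weight vector is treated as a unit-mass particle moving through the potential field defined by the error surface, so its acceleration is the negative gradient, and the trajectory is integrated with leap-frog steps. The Python sketch below illustrates that idea only; the function name leapfrog_minimize, the uphill velocity-damping rule, and the stopping criterion are illustrative assumptions, not the exact strategy evaluated in the paper.

    import numpy as np

    def leapfrog_minimize(f, grad, x0, dt=0.1, max_iter=5000, tol=1e-6):
        """Illustrative leap-frog (dynamic-trajectory) minimizer.

        The weight vector x is treated as a unit-mass particle in the
        potential field f, so its acceleration is -grad(x).  The uphill
        damping rule below is a simplification of the interference
        heuristics used in practice.
        """
        x = np.asarray(x0, dtype=float)
        v = np.zeros_like(x)             # particle starts at rest
        f_x = f(x)
        for _ in range(max_iter):
            a = -grad(x)                 # force acting on the particle
            if np.linalg.norm(a) < tol:  # stop near a stationary point
                break
            v = v + a * dt               # leap-frog velocity update
            x_trial = x + v * dt         # leap-frog position update
            f_trial = f(x_trial)
            if f_trial < f_x:            # downhill: accept the step
                x, f_x = x_trial, f_trial
            else:                        # uphill: reject the step, damp motion
                v = 0.5 * v
        return x

    # Toy usage: recover the minimizer of a simple quadratic "error surface".
    f = lambda w: float(np.sum((w - 3.0) ** 2))
    grad = lambda w: 2.0 * (w - 3.0)
    print(leapfrog_minimize(f, grad, np.zeros(4)))  # approximately [3 3 3 3]

Because the particle carries momentum, it can roll through shallow local minima rather than stopping in them, which is the property the abstract credits for the method's robustness across repeated optimization trials.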
