
Stability Analysis of the Modified Levenberg-Marquardt Algorithm for the Artificial Neural Network Training.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2021 Aug;32(8):3510-3524. doi: 10.1109/TNNLS.2020.3015200. Epub 2021 Aug 3.

Abstract

The Levenberg-Marquardt and Newton algorithms both use the Hessian for artificial neural network learning. In this article, we propose a modified Levenberg-Marquardt algorithm for artificial neural network learning, covering both the training and testing stages. The modified algorithm is based on the Levenberg-Marquardt and Newton algorithms but differs in two ways that assure error stability and weight boundedness: 1) the learning rates of the Levenberg-Marquardt and Newton algorithms have a singularity point, whereas the learning rate of the modified Levenberg-Marquardt algorithm has none, and 2) the Levenberg-Marquardt and Newton algorithms use three different learning rates, whereas the modified Levenberg-Marquardt algorithm uses only one. The error stability and weight boundedness of the modified Levenberg-Marquardt algorithm are assured via the Lyapunov technique. We compare artificial neural network learning with the modified Levenberg-Marquardt, Levenberg-Marquardt, Newton, and stable gradient algorithms on electric and brain signal data sets.
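The classical Levenberg-Marquardt update that the abstract contrasts against can be sketched as follows. This is a minimal illustration, not the paper's modified algorithm: the function names and the toy least-squares setup are assumptions, and the singularity the paper addresses arises when the damped normal matrix below becomes ill-conditioned as the damping term shrinks.

```python
import numpy as np

def lm_step(w, jacobian, error, mu=1e-3):
    """One classical Levenberg-Marquardt update:
        w <- w - (J^T J + mu*I)^{-1} J^T e
    As mu -> 0 this reduces to Gauss-Newton, whose normal matrix
    J^T J can become singular -- the singularity point in the
    learning rate that the modified algorithm is designed to avoid."""
    J = jacobian(w)                      # Jacobian of residuals w.r.t. weights
    e = error(w)                         # residual vector
    A = J.T @ J + mu * np.eye(w.size)    # damped normal matrix
    return w - np.linalg.solve(A, J.T @ e)

# Toy example: fit y = a*x to data generated with a = 2.
x = np.array([1.0, 2.0, 3.0])
y = 2.0 * x
w = np.array([0.0])                      # single weight a, started at 0
for _ in range(5):
    w = lm_step(w, lambda w: x.reshape(-1, 1), lambda w: w[0] * x - y)
```

Here each step solves a small damped linear system; the modified algorithm of the paper instead uses a single learning rate whose denominator is bounded away from zero, which is what yields the Lyapunov-based stability guarantee.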

