
Stability Analysis of the Modified Levenberg-Marquardt Algorithm for the Artificial Neural Network Training.

Publication information

IEEE Trans Neural Netw Learn Syst. 2021 Aug;32(8):3510-3524. doi: 10.1109/TNNLS.2020.3015200. Epub 2021 Aug 3.

Abstract

The Levenberg-Marquardt and Newton algorithms are two methods that use the Hessian for artificial neural network learning. In this article, we propose a modified Levenberg-Marquardt algorithm for artificial neural network learning that covers both the training and testing stages. The modified Levenberg-Marquardt algorithm is based on the Levenberg-Marquardt and Newton algorithms, with two differences that assure error stability and weight boundedness: 1) the learning rates of the Levenberg-Marquardt and Newton algorithms have a singularity point, whereas the learning rate of the modified Levenberg-Marquardt algorithm does not, and 2) the Levenberg-Marquardt and Newton algorithms use three different learning rates, whereas the modified Levenberg-Marquardt algorithm uses only one. The error stability and weight boundedness of the modified Levenberg-Marquardt algorithm are established with the Lyapunov technique. We compare artificial neural network learning with the modified Levenberg-Marquardt, Levenberg-Marquardt, Newton, and stable gradient algorithms on electric and brain signal data sets.
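The abstract describes the modification only qualitatively (a single learning rate with no singularity point), so the sketch below is an illustrative assumption rather than the authors' update rule. It contrasts the classic Levenberg-Marquardt step, whose damped Hessian solve can become ill-conditioned, with a hypothetical bounded-rate step whose effective learning rate alpha / (eps + ||J||_F^2) can never blow up; the names modified_lm_step, alpha, and eps are invented for illustration.

```python
import numpy as np

def levenberg_marquardt_step(J, e, mu=1e-3):
    """Classic Levenberg-Marquardt weight update.

    J  : Jacobian of the network errors w.r.t. the weights, shape (n_samples, n_weights)
    e  : error vector (outputs minus targets), shape (n_samples,)
    mu : damping factor; LM interpolates between Gauss-Newton (small mu)
         and gradient descent (large mu).
    """
    H_approx = J.T @ J                 # Gauss-Newton approximation of the Hessian
    g = J.T @ e                        # gradient of the squared-error cost
    # Solving the damped system is where the learning rates of the classic
    # LM and Newton updates can hit a singularity point if the matrix
    # becomes (near-)singular.
    return -np.linalg.solve(H_approx + mu * np.eye(J.shape[1]), g)

def modified_lm_step(J, e, alpha=0.5, eps=1.0):
    """Hypothetical singularity-free update with a single learning rate.

    The abstract only states that the modified algorithm has one learning
    rate with no singularity point; alpha, eps, and the normalized form
    below are assumptions made for this sketch, not the paper's rule.
    """
    g = J.T @ e
    rate = alpha / (eps + np.linalg.norm(J, 'fro') ** 2)   # bounded for all J
    return -rate * g

# Toy usage: one update step for a linear single-output model y = X w,
# for which the Jacobian of the error w.r.t. the weights is simply X.
X = np.random.randn(20, 3)             # 20 samples, 3 weights
w = np.zeros(3)
e = X @ w - np.random.randn(20)        # error vector
print(levenberg_marquardt_step(X, e))
print(modified_lm_step(X, e))
```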

