A Novel Black Widow Optimization Algorithm Based on Lagrange Interpolation Operator for ResNet18.

Authors

Wei Peiyang, Hu Can, Hu Jingyi, Li Zhibin, Qin Wen, Gan Jianhong, Chen Tinghui, Shu Hongping, Shang Mingsheng

Affiliations

School of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China.

School of Software Engineering, Chengdu University of Information Technology, Chengdu 610225, China.

Publication Information

Biomimetics (Basel). 2025 Jun 3;10(6):361. doi: 10.3390/biomimetics10060361.

Abstract

Hyper-parameters play a critical role in neural networks: they strongly affect both training effectiveness and overall model performance, and well-chosen settings can accelerate convergence and improve generalization. Among them, the learning rate is particularly important. However, tuning the learning rate typically requires extensive experimentation, since a good setting depends on the specific task and dataset and no universal rule exists; in practice it is chosen by trial and error, which makes the selection complex and time-consuming. Evolutionary computation algorithms can overcome this challenge by adjusting the learning rate automatically, improving both training efficiency and model performance. To this end, we propose a black widow optimization algorithm based on Lagrange interpolation (LIBWONN) to optimize the learning rate of ResNet18. We evaluate LIBWONN on 24 benchmark functions from CEC2017 and CEC2022 against nine advanced metaheuristic algorithms; the experimental results indicate that LIBWONN outperforms the other algorithms in convergence and stability. Experiments on publicly available datasets from six different fields further show that, compared with the standard BWO, LIBWONN improves accuracy on the training and testing sets by 6.99% and 4.48%, respectively.
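The abstract does not spell out how the Lagrange interpolation operator enters the black widow search, so the sketch below illustrates only one plausible reading: a BWO-style population of candidate learning rates in which, each generation, the vertex of the parabola interpolated through the three best candidates proposes a refined point. The names (fitness, lagrange_quadratic_min) and the toy surrogate objective standing in for a short ResNet18 training run are assumptions of this sketch, not the authors' code.

    import math
    import random

    # Toy surrogate standing in for "train ResNet18 briefly at this learning
    # rate and return validation loss" (an assumption of this sketch; the
    # quadratic bowl makes the example self-contained, with its minimum near
    # lr = 10**-2.5).
    def fitness(lr: float) -> float:
        return (math.log10(lr) + 2.5) ** 2

    def lagrange_quadratic_min(x, f):
        """Vertex of the parabola that Lagrange interpolation fits through
        three (x_i, f_i) samples; falls back to the best point when the
        samples are (near-)collinear."""
        (x1, x2, x3), (f1, f2, f3) = x, f
        num = (x2**2 - x3**2) * f1 + (x3**2 - x1**2) * f2 + (x1**2 - x2**2) * f3
        den = (x2 - x3) * f1 + (x3 - x1) * f2 + (x1 - x2) * f3
        return 0.5 * num / den if abs(den) > 1e-12 else x1

    # Simplified BWO-style loop: reproduce around the best candidate, keep
    # the fitter half ("cannibalism"), then let the interpolation operator
    # propose a refined learning rate from the three best points.
    random.seed(0)
    pop = sorted((10 ** random.uniform(-5.0, -1.0) for _ in range(10)), key=fitness)
    for _ in range(30):
        best = pop[0]
        offspring = [best * 10 ** random.gauss(0.0, 0.3) for _ in range(10)]
        pop = sorted(pop + offspring, key=fitness)[:10]
        top3 = pop[:3]
        cand = lagrange_quadratic_min(top3, [fitness(v) for v in top3])
        if 1e-6 < cand < 1.0 and fitness(cand) < fitness(pop[-1]):
            pop[-1] = cand  # refined point replaces the worst survivor
            pop.sort(key=fitness)
    print(f"suggested learning rate ~ {pop[0]:.2e}")

In a real run, fitness would launch a short ResNet18 training run at the candidate learning rate; the interpolation refinement is the ingredient the title highlights as the source of the faster, more stable convergence.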

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8c49/12190972/291380826e80/biomimetics-10-00361-g001.jpg
