Wei Peiyang, Zou Rundong, Gan Jianhong, Li Zhibin
School of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China.
School of Software Engineering, Chengdu University of Information Technology, Chengdu 610225, China.
Biomimetics (Basel). 2025 Aug 19;10(8):544. doi: 10.3390/biomimetics10080544.
Convolutional neural networks (CNNs) and their improved variants (such as DenseNet-121) have achieved significant results in image classification tasks. However, the performance of these models is still constrained by issues such as hyperparameter optimization and vanishing and exploding gradients. Owing to their distinctive exploration and exploitation capabilities, evolutionary algorithms offer new avenues for addressing these problems. In addition, to keep such algorithms from falling into local optima during the search, this study designs a novel interpolation scheme. To achieve better image classification performance, enhancing classification accuracy and model stability, this paper employs a hybrid algorithm (HGAO) that combines the horned lizard optimization algorithm with quadratic interpolation and the giant armadillo optimization with Newton interpolation to optimize the hyperparameters of DenseNet-121, applied to five datasets spanning different domains. Because the learning rate and dropout rate have notable impacts on the results of the DenseNet-121 model, they are chosen as the hyperparameters to be optimized. Experiments with the HGAO algorithm are conducted on the five image datasets and compared against nine state-of-the-art algorithms, with model performance evaluated by accuracy, precision, recall, and F1-score. The experimental results reveal that the hyperparameter combination becomes more reasonable after optimization with HGAO, yielding a crucial improvement: in the comparative experiments, image classification accuracy on the training set increased by up to 0.5%, with a maximum loss reduction of 0.018, while on the test set accuracy rose by 0.5% and the loss decreased by 54 points. The HGAO algorithm thus provides an effective solution for optimizing the DenseNet-121 model.
The designed method boosts classification accuracy and model stability, while also markedly improving the effect of hyperparameter optimization and alleviating gradient difficulties.
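The hyperparameter search described above can be sketched as a population-based loop with a quadratic-interpolation refinement step. The following is a minimal, illustrative sketch only, not the authors' implementation: the surrogate loss, the search bounds, the population update, and all function names are assumptions standing in for training DenseNet-121 and for the full horned lizard/giant armadillo update rules; only the parabola-vertex formula used in the exploitation step is standard.

```python
import random

def surrogate_loss(lr, dropout):
    # Toy stand-in for DenseNet-121 validation loss; the illustrative
    # optimum at lr = 0.01, dropout = 0.3 is an assumption, not a paper result.
    return 1e4 * (lr - 0.01) ** 2 + (dropout - 0.3) ** 2

def quad_interp_min(a, fa, b, fb, c, fc):
    # Vertex of the parabola through (a, fa), (b, fb), (c, fc).
    den = 2.0 * ((b - c) * fa + (c - a) * fb + (a - b) * fc)
    if den == 0.0:          # collinear points: no unique vertex
        return b
    num = (b * b - c * c) * fa + (c * c - a * a) * fb + (a * a - b * b) * fc
    return num / den

def tune_hyperparameters(pop_size=15, iters=40, seed=1):
    rng = random.Random(seed)
    clamp = lambda x, lo, hi: min(max(x, lo), hi)
    # Population of (learning_rate, dropout) candidates within assumed bounds.
    pop = [(rng.uniform(1e-4, 0.1), rng.uniform(0.0, 0.5)) for _ in range(pop_size)]
    for _ in range(iters):
        pop.sort(key=lambda p: surrogate_loss(*p))
        best_lr, best_dr = pop[0]
        # Exploration: pull candidates toward the current best with Gaussian noise.
        nxt = [pop[0]]  # keep the elite
        for lr, dr in pop[1:]:
            nxt.append((
                clamp(lr + 0.5 * (best_lr - lr) + rng.gauss(0.0, 0.005), 1e-4, 0.1),
                clamp(dr + 0.5 * (best_dr - dr) + rng.gauss(0.0, 0.05), 0.0, 0.5),
            ))
        # Exploitation: quadratic interpolation over the three best learning rates,
        # holding the best candidate's dropout fixed.
        nxt.sort(key=lambda p: surrogate_loss(*p))
        (a, d0), (b, _), (c, _) = nxt[0], nxt[1], nxt[2]
        if len({a, b, c}) == 3:
            lr_star = clamp(
                quad_interp_min(a, surrogate_loss(a, d0),
                                b, surrogate_loss(b, d0),
                                c, surrogate_loss(c, d0)),
                1e-4, 0.1)
            if surrogate_loss(lr_star, d0) < surrogate_loss(*nxt[-1]):
                nxt[-1] = (lr_star, d0)  # replace the worst candidate
        pop = nxt
    return min(pop, key=lambda p: surrogate_loss(*p))
```

In a real run, `surrogate_loss` would train and validate DenseNet-121 with the candidate learning rate and dropout, which is why population sizes and iteration counts are kept small in practice.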
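The evaluation metrics named in the abstract follow their standard definitions from a binary confusion matrix; a small helper (the function name is hypothetical) makes the formulas explicit. For the multi-class datasets used in the paper, these would typically be averaged per class.

```python
def classification_metrics(tp, fp, fn, tn):
    # Standard binary-classification metrics from confusion-matrix counts.
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)  # a.k.a. sensitivity
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Example: 40 true positives, 10 false positives, 20 false negatives, 30 true negatives
# → accuracy 0.7, precision 0.8, recall 2/3, F1 = 8/11
acc, prec, rec, f1 = classification_metrics(40, 10, 20, 30)
```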