Suppr 超能文献


Improved fractional-order gradient descent method based on multilayer perceptron.

Authors

Zhou Xiaojun, Zhao Chunna, Huang Yaqun, Zhou Chengli, Ye Junjie

Affiliations

School of Information Science and Engineering, Yunnan University, Kunming, 650091, China.

Publication

Neural Netw. 2025 Mar;183:106970. doi: 10.1016/j.neunet.2024.106970. Epub 2024 Dec 1.

DOI: 10.1016/j.neunet.2024.106970
PMID: 39642645
Abstract

The fractional-order gradient descent (FOGD) method has been employed by numerous scholars in Artificial Neural Networks (ANN), with its superior performance validated both theoretically and experimentally. However, current FOGD methods apply fractional-order differentiation only to the loss function. Applying FOGD to the hidden layers via Autograd leverages the characteristics of fractional-order differentiation and significantly enhances its flexibility. Moreover, implementing FOGD in the hidden layers is a necessary foundation for establishing a family of fractional-order deep learning optimizers, facilitating the widespread application of FOGD in deep learning. This paper proposes an improved fractional-order gradient descent (IFOGD) method based on the Multilayer Perceptron (MLP). First, a fractional matrix differentiation algorithm and its solver are proposed based on MLP, ensuring that IFOGD can be applied within the hidden layers. Second, we overcome the incorrect backpropagation direction caused by the absolute value symbol, ensuring that the IFOGD method does not cause the loss function to diverge. Third, fractional-order Autograd (FOAutograd) is proposed based on PyTorch by reconstructing the Linear layer and the Mean Squared Error loss module. By combining FOAutograd with first-order adaptive deep learning optimizers, the parameter matrices in each layer of an ANN can be updated using fractional-order gradients. Finally, we compare and analyze the performance of IFOGD against other methods in simulation experiments and time series prediction tasks. The experimental results demonstrate that the IFOGD method exhibits superior performance.
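The FOGD update the abstract builds on is commonly approximated in the literature via the Caputo fractional derivative, which rescales the ordinary gradient by a power of the distance to an expansion point. The following is a minimal NumPy sketch of that update on a toy quadratic loss, not the paper's FOAutograd implementation; the function name, the choice of the previous iterate as the expansion point, and the toy loss are all illustrative assumptions:

```python
import numpy as np
from math import gamma

def fogd_step(w, grad, w_prev, alpha=0.9, lr=0.1, eps=1e-8):
    """One fractional-order gradient descent (FOGD) update (illustrative).

    Uses the common Caputo-based approximation
        D^alpha f(w) ~= f'(w) * |w - w_prev|^(1 - alpha) / Gamma(2 - alpha),
    with the previous iterate w_prev as the expansion point. Note the
    gradient's sign is preserved: only the nonnegative factor |.|^(1-alpha)
    is taken in absolute value, so the descent direction stays correct.
    eps avoids a fully stalled update when w == w_prev.
    """
    frac = np.abs(w - w_prev) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w - lr * grad * (frac + eps)

# Toy demo: minimize f(w) = 0.5 * ||w||^2, whose gradient is w.
w_prev = np.array([2.0, -1.5])
w = np.array([1.0, -0.8])
losses = []
for _ in range(50):
    grad = w                                  # gradient of 0.5*||w||^2
    w, w_prev = fogd_step(w, grad, w_prev), w  # advance, remember old iterate
    losses.append(0.5 * float(w @ w))

print(losses[0] > losses[-1])  # True: the loss decreases monotonically here
```

With alpha = 1 the scaling factor reduces to |w - w_prev|^0 / Gamma(1) = 1 and the update collapses to ordinary gradient descent, which is one way to sanity-check a fractional-order optimizer implementation.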

Similar Articles

1. Improved fractional-order gradient descent method based on multilayer perceptron.
   Neural Netw. 2025 Mar;183:106970. doi: 10.1016/j.neunet.2024.106970. Epub 2024 Dec 1.
2. A fractional gradient descent algorithm robust to the initial weights of multilayer perceptron.
   Neural Netw. 2023 Jan;158:154-170. doi: 10.1016/j.neunet.2022.11.018. Epub 2022 Nov 17.
3. Deep convolutional neural network and IoT technology for healthcare.
   Digit Health. 2024 Jan 17;10:20552076231220123. doi: 10.1177/20552076231220123. eCollection 2024 Jan-Dec.
4. Fractional-order stochastic gradient descent method with momentum and energy for deep neural networks.
   Neural Netw. 2025 Jan;181:106810. doi: 10.1016/j.neunet.2024.106810. Epub 2024 Oct 19.
5. Novel maximum-margin training algorithms for supervised neural networks.
   IEEE Trans Neural Netw. 2010 Jun;21(6):972-84. doi: 10.1109/TNN.2010.2046423. Epub 2010 Apr 19.
6. A learning rule for very simple universal approximators consisting of a single layer of perceptrons.
   Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. Epub 2007 Dec 31.
7. Optimizing neural networks for medical data sets: A case study on neonatal apnea prediction.
   Artif Intell Med. 2019 Jul;98:59-76. doi: 10.1016/j.artmed.2019.07.008. Epub 2019 Jul 25.
8. Data classification based on fractional order gradient descent with momentum for RBF neural network.
   Network. 2020 Feb-Nov;31(1-4):166-185. doi: 10.1080/0954898X.2020.1849842. Epub 2020 Dec 6.
9. An artificial neural network to model response of a radiotherapy beam monitoring system.
   Med Phys. 2020 Apr;47(4):1983-1994. doi: 10.1002/mp.14033. Epub 2020 Feb 3.
10. GGA-MLP: A Greedy Genetic Algorithm to Optimize Weights and Biases in Multilayer Perceptron.
    Contrast Media Mol Imaging. 2022 Feb 24;2022:4036035. doi: 10.1155/2022/4036035. eCollection 2022.