College of Computer Science, Sichuan University, Chengdu 610065, China.
Comput Intell Neurosci. 2018 Jul 3;2018:7361628. doi: 10.1155/2018/7361628. eCollection 2018.
In recent years, research on artificial neural networks based on fractional calculus has attracted considerable attention. In this paper, we propose a fractional-order deep backpropagation (BP) neural network model with regularization. The proposed network is optimized by the fractional gradient descent method with the Caputo derivative. We also derive the necessary conditions for convergence of the proposed network and analyze the influence of regularization on convergence using the fractional-order variational method. Experiments on the MNIST dataset demonstrate that the proposed network is deterministically convergent and can effectively avoid overfitting.
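As a rough illustration of the optimization idea, the sketch below shows fractional gradient descent on a one-dimensional toy loss. It uses the common first-term truncation of the Caputo derivative, under which the ordinary gradient is scaled by |w − c|^(1−α) / Γ(2−α), with the lower terminal c taken as the previous iterate. This is a generic approximation from the fractional gradient descent literature, not necessarily the exact scheme of the paper; the function names, hyperparameters, and toy loss are all illustrative assumptions.

```python
import math

def caputo_fractional_gd(grad, w0, alpha=0.9, lr=0.1, steps=200, eps=1e-8):
    """Gradient descent with a first-term Caputo fractional gradient.

    For alpha in (0, 1), the Caputo derivative truncated at the first
    term of its Taylor expansion scales the ordinary gradient by
    |w - c|**(1 - alpha) / Gamma(2 - alpha), where the lower terminal c
    is chosen here as the previous iterate (an illustrative choice).
    """
    w_prev = w0
    w = w0 - lr * grad(w0)  # one plain first-order step to initialize c
    for _ in range(steps):
        # Caputo scaling factor from the first-term approximation;
        # eps keeps the update nonzero when consecutive iterates coincide
        frac = abs(w - w_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
        w_new = w - lr * grad(w) * (frac + eps)
        w_prev, w = w, w_new
    return w

# Toy loss L(w) = (w - 3)^2 with gradient 2(w - 3); the minimum is at w = 3.
w_star = caputo_fractional_gd(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

Note that as the iterates approach a stationary point, |w − c| shrinks and the fractional scaling factor decays toward zero, which slows the updates; this vanishing effective step size is one reason convergence conditions for such schemes need separate analysis, as the paper addresses.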