Xue Han, Shao Zheping, Sun Hongbo
Institute of Navigation, Jimei University, Xiamen, China.
Network. 2020 Feb-Nov;31(1-4):166-185. doi: 10.1080/0954898X.2020.1849842. Epub 2020 Dec 6.
Weight-updating methods play an important role in improving the performance of neural networks. To mitigate the oscillation that occurs when training radial basis function (RBF) neural networks, a fractional-order gradient descent with momentum method for updating the weights of an RBF neural network (FOGDM-RBF) is proposed for data classification, and its convergence is proved. To speed up convergence, an adaptive learning rate is used to adjust the training process. The Iris and MNIST data sets are used to test the proposed algorithm. The results verify its theoretical properties, such as monotonicity and convergence. Non-parametric statistical tests, including the Friedman test and the Quade test, are applied to compare the proposed algorithm with other algorithms. The influence of the fractional order, the learning rate and the batch size is analysed and compared. Error analysis shows that the algorithm effectively accelerates the convergence of the gradient descent method and improves its performance, with high accuracy and validity.
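The abstract does not give the update equations, so the sketch below is a minimal, illustrative reconstruction rather than the authors' FOGDM-RBF. It assumes the Caputo-inspired approximation of the fractional gradient common in the fractional-gradient-descent literature, ∂^αL/∂w^α ≈ (∂L/∂w)·|w|^(1-α)/Γ(2-α), a simple 1/(1+kt) decay as a stand-in for the paper's adaptive learning rate, and a toy two-class data set in place of Iris/MNIST; all names and hyperparameter values (alpha, mu, eta) are assumptions.

```python
# Illustrative sketch of fractional-order gradient descent with momentum
# (FOGDM) for the output weights of a Gaussian RBF network. NOT the paper's
# code: the fractional-gradient form, the learning-rate decay, and all
# hyperparameters below are assumptions.
import numpy as np
from math import gamma

rng = np.random.default_rng(0)

# Toy two-class data in 2-D (stand-in for the Iris/MNIST sets in the paper).
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.hstack([np.zeros(50), np.ones(50)])

# Fixed Gaussian RBF hidden layer: centres sampled from the data, shared width.
centres = X[rng.choice(len(X), 10, replace=False)]
width = 1.0

def rbf_features(X):
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

H = rbf_features(X)                 # (N, 10) hidden activations
w = rng.normal(0, 0.1, 10)          # output weights feeding a sigmoid
v = np.zeros_like(w)                # momentum buffer

alpha, mu, eta = 0.9, 0.9, 0.5      # fractional order, momentum, base rate
eps = 1e-8                          # keeps |w|^(1-alpha) finite at w = 0

for epoch in range(200):
    p = 1 / (1 + np.exp(-H @ w))            # sigmoid prediction
    g = H.T @ (p - y) / len(y)              # ordinary cross-entropy gradient
    # Caputo-inspired fractional gradient of order alpha (assumed form):
    g_frac = g * (np.abs(w) + eps) ** (1 - alpha) / gamma(2 - alpha)
    eta_t = eta / (1 + 0.01 * epoch)        # assumed adaptive (decaying) rate
    v = mu * v + eta_t * g_frac             # momentum accumulation
    w -= v                                  # weight update

acc = ((1 / (1 + np.exp(-H @ w)) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

At alpha = 1 the fractional factor reduces to 1/Γ(1) = 1 and the update collapses to ordinary gradient descent with momentum, which is why the fractional order is treated as a tunable hyperparameter alongside the learning rate and batch size.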
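For the non-parametric comparison mentioned above, the Friedman test is available in SciPy as scipy.stats.friedmanchisquare, which takes matched score columns (one per algorithm, one row per data set or run). The accuracy values below are invented placeholders, not results from the paper, and SciPy does not ship a Quade test, so only the Friedman step is sketched.

```python
# Friedman test over matched per-run scores of three optimisers.
# The accuracy values are hypothetical placeholders.
from scipy.stats import friedmanchisquare

fogdm_rbf = [0.97, 0.95, 0.96, 0.94, 0.98]
sgd       = [0.93, 0.91, 0.94, 0.90, 0.95]
momentum  = [0.95, 0.93, 0.95, 0.92, 0.96]

stat, p = friedmanchisquare(fogdm_rbf, sgd, momentum)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")
```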