School of Computer and Information Technology, Northeast Petroleum University, Daqing 163318, China.
Heilongjiang Key Laboratory of Petroleum Big Data and Intelligent Analysis, Daqing 163318, China.
Comput Intell Neurosci. 2023 Jan 10;2023:4765891. doi: 10.1155/2023/4765891. eCollection 2023.
An improved Adam optimization algorithm combining adaptive coefficients and composite gradients based on randomized block coordinate descent is proposed to address shortcomings of the Adam algorithm, namely slow convergence, a tendency to miss the global optimal solution, and inefficiency in processing high-dimensional vectors. First, an adaptive coefficient is used to adjust the gradient deviation and correct the search direction. Then, a predicted gradient is introduced, and the current gradient is combined with the first-order momentum to form a composite gradient, improving the global optimization ability. Finally, a randomized block coordinate method determines the gradient update mode, which reduces the computational overhead. Simulation experiments on two standard classification datasets show that the proposed algorithm converges faster and more accurately than the six comparison gradient descent methods, while CPU and memory utilization are significantly reduced. In addition, BP neural networks optimized by each of the six algorithms are applied to well-logging data to predict reservoir porosity. The results show that the proposed method has lower system overhead, higher accuracy, and stronger stability, with the absolute error within 0.1% for more than 86% of the data, further verifying its effectiveness.
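The abstract names three modifications to Adam without giving their formulas, so the following is only a minimal sketch of how such a variant could look. The mixing weight `lam`, the cosine-based adaptive coefficient, and the block fraction `block_frac` are illustrative assumptions, not the paper's actual definitions:

```python
import numpy as np

def improved_adam_step(theta, grad_fn, m, v, t,
                       lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8,
                       lam=0.5, block_frac=0.5, rng=None):
    """One Adam-like step illustrating the abstract's three ideas.

    NOTE: the paper's exact update rules are not given in the abstract;
    the adaptive coefficient, composite gradient, and block selection
    below are plausible stand-ins, not the authors' formulas.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = grad_fn(theta)

    # (1) Adaptive coefficient: damp the raw gradient when it deviates
    # strongly from the momentum direction (one reading of "adjust the
    # gradient deviation and correct the search direction").
    denom = np.linalg.norm(g) * np.linalg.norm(m) + eps
    coef = 0.5 * (1.0 + np.dot(g, m) / denom)  # in [0, 1]

    # (2) Composite gradient: mix the current gradient with the
    # first-order momentum used as a cheap "predicted" gradient.
    g_comp = lam * coef * g + (1.0 - lam) * m

    # Standard Adam moment estimates with bias correction.
    m = beta1 * m + (1 - beta1) * g_comp
    v = beta2 * v + (1 - beta2) * g_comp ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)

    # (3) Randomized block coordinate descent: update only a random
    # subset of coordinates, reducing per-step cost on high-dim vectors.
    k = max(1, int(block_frac * theta.size))
    block = rng.choice(theta.size, size=k, replace=False)
    theta = theta.copy()
    theta[block] -= lr * m_hat[block] / (np.sqrt(v_hat[block]) + eps)
    return theta, m, v
```

As a usage sketch, iterating this step on a convex quadratic (gradient `2x`) drives the parameter norm toward zero, which is enough to check that the composite-gradient and block-update machinery behaves like a descent method.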