Hao Wenrui, Hong Qingguo, Jin Xianlin
Department of Mathematics, Pennsylvania State University, State College, USA.
Department of Mathematics and Statistics, Missouri University of Science and Technology, Rolla, USA.
J Sci Comput. 2024 Jul;100(1). doi: 10.1007/s10915-024-02535-z. Epub 2024 Jun 3.
The numerical solution of differential equations using machine learning-based approaches has gained significant popularity. Neural network-based discretization has emerged as a powerful tool for solving differential equations by parameterizing a set of functions. Various approaches, such as the deep Ritz method and physics-informed neural networks, have been developed for numerical solutions. Training algorithms, including gradient descent and greedy algorithms, have been proposed to solve the resulting optimization problems. In this paper, we focus on the variational formulation of the problem and propose a Gauss-Newton method for computing the numerical solution. We provide a comprehensive analysis of the superlinear convergence properties of this method, along with a discussion on semi-regular zeros of the vanishing gradient. Numerical examples are presented to demonstrate the efficiency of the proposed Gauss-Newton method.
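To illustrate the iteration the abstract refers to, below is a minimal sketch of the classical Gauss-Newton method for a nonlinear least-squares problem. This is a generic textbook formulation, not the paper's neural-network discretization; the toy exponential model, the function names, and the starting point are illustrative assumptions.

```python
import numpy as np

def gauss_newton(residual, jacobian, theta0, tol=1e-10, max_iter=50):
    """Classical Gauss-Newton iteration for minimizing ||r(theta)||^2.

    Illustrative sketch only: the paper applies a (far more general)
    Gauss-Newton method to the variational formulation with
    neural-network parameterizations.
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        r = residual(theta)
        J = jacobian(theta)
        # Gauss-Newton step: solve J dtheta ≈ -r in the least-squares
        # sense (equivalent to the normal equations J^T J dtheta = -J^T r).
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        theta = theta + step
        if np.linalg.norm(step) < tol:
            break
    return theta

# Toy example (assumed for illustration): fit y = exp(a*x) + b.
x = np.linspace(0.0, 1.0, 20)
a_true, b_true = 1.5, 0.3
y = np.exp(a_true * x) + b_true

res = lambda th: np.exp(th[0] * x) + th[1] - y
jac = lambda th: np.column_stack([x * np.exp(th[0] * x), np.ones_like(x)])

theta_hat = gauss_newton(res, jac, np.array([1.0, 0.0]))
```

On zero-residual problems such as this toy fit, Gauss-Newton converges very fast, which is the regime (semi-regular zeros of the gradient) the paper's superlinear-convergence analysis addresses.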