National University of Singapore, 21 Lower Kent Ridge Road, 119077, Singapore.
Division of Applied Mathematics, Brown University, 182 George Street, Providence, RI 02912, USA.
Neural Netw. 2024 Aug;176:106369. doi: 10.1016/j.neunet.2024.106369. Epub 2024 May 7.
The curse of dimensionality taxes computational resources heavily: the cost of computation grows exponentially as the dimension increases. This poses great challenges in solving high-dimensional partial differential equations (PDEs), as Richard E. Bellman first pointed out over 60 years ago. While there has been some recent success in solving numerical PDEs in high dimensions, such computations are prohibitively expensive, and true scaling of general nonlinear PDEs to high dimensions has never been achieved. We develop a new method for scaling up physics-informed neural networks (PINNs) to solve arbitrary high-dimensional PDEs. The new method, called Stochastic Dimension Gradient Descent (SDGD), decomposes the gradient of the PDE and PINN residual into pieces corresponding to different dimensions and randomly samples a subset of these dimensional pieces at each iteration of PINN training. We theoretically prove the convergence and other desirable properties of the proposed method. We demonstrate in diverse tests that the proposed method can solve many notoriously hard high-dimensional PDEs, including the Hamilton-Jacobi-Bellman (HJB) and the Schrödinger equations in tens of thousands of dimensions, very fast on a single GPU using the mesh-free PINNs approach. Notably, we solve nonlinear PDEs with nontrivial, anisotropic, and inseparable solutions in less than one hour for 1,000 dimensions and in 12 hours for 100,000 dimensions on a single GPU using SDGD with PINNs. Since SDGD is a general training methodology for PINNs, it can be applied to any current and future variants of PINNs to scale them up for arbitrary high-dimensional PDEs.
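To make the dimension-sampling idea in the abstract concrete, the following is a minimal PyTorch sketch, not the paper's exact formulation: the dimension-wise second-derivative terms of a toy Poisson residual are randomly subsampled at each iteration and rescaled so the stochastic estimate is unbiased in expectation. The network architecture, the toy PDE, the right-hand side f, and the hyperparameters (dim, batch, k) are all illustrative assumptions.

    # Sketch of SDGD-style dimension sampling for a PINN (assumptions noted above).
    import torch

    dim, batch, k = 1000, 64, 16          # problem dimension, collocation points, sampled dims
    net = torch.nn.Sequential(            # simple PINN surrogate u_theta(x); an assumption
        torch.nn.Linear(dim, 128), torch.nn.Tanh(), torch.nn.Linear(128, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    def f(x):                             # toy right-hand side of -Laplacian(u) = f
        return torch.ones(x.shape[0], 1)

    for step in range(100):
        x = torch.rand(batch, dim, requires_grad=True)  # mesh-free collocation points
        u = net(x)
        (g,) = torch.autograd.grad(u.sum(), x, create_graph=True)  # all first derivatives

        # SDGD idea: sample k of the dim second-derivative terms and rescale by
        # dim/k so the stochastic Laplacian matches the full sum in expectation.
        idx = torch.randperm(dim)[:k]
        lap = torch.zeros(batch, 1)
        for i in idx:
            (h,) = torch.autograd.grad(g[:, i].sum(), x, create_graph=True)
            lap = lap + h[:, i:i+1]       # d^2 u / dx_i^2 for sampled dimension i
        lap = lap * (dim / k)

        loss = ((-lap - f(x)) ** 2).mean()  # PDE residual loss on sampled dimensions
        opt.zero_grad(); loss.backward(); opt.step()

Only k Hessian-diagonal terms are differentiated per step, which is what keeps the per-iteration cost from growing with the full dimension; the trade-off is stochastic variance in the residual estimate, which the paper's convergence analysis addresses.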