


Tackling the curse of dimensionality with physics-informed neural networks.

Affiliations

National University of Singapore, 21 Lower Kent Ridge Road, 119077, Singapore.

Division of Applied Mathematics, Brown University, 182 George Street, Providence, RI 02912, USA.

Publication Information

Neural Netw. 2024 Aug;176:106369. doi: 10.1016/j.neunet.2024.106369. Epub 2024 May 7.

DOI: 10.1016/j.neunet.2024.106369
PMID: 38754287
Abstract

The curse-of-dimensionality taxes computational resources heavily with exponentially increasing computational cost as the dimension increases. This poses great challenges in solving high-dimensional partial differential equations (PDEs), as Richard E. Bellman first pointed out over 60 years ago. While there has been some recent success in solving numerical PDEs in high dimensions, such computations are prohibitively expensive, and true scaling of general nonlinear PDEs to high dimensions has never been achieved. We develop a new method of scaling up physics-informed neural networks (PINNs) to solve arbitrary high-dimensional PDEs. The new method, called Stochastic Dimension Gradient Descent (SDGD), decomposes a gradient of PDEs' and PINNs' residual into pieces corresponding to different dimensions and randomly samples a subset of these dimensional pieces in each iteration of training PINNs. We prove theoretically the convergence and other desired properties of the proposed method. We demonstrate in various diverse tests that the proposed method can solve many notoriously hard high-dimensional PDEs, including the Hamilton-Jacobi-Bellman (HJB) and the Schrödinger equations in tens of thousands of dimensions very fast on a single GPU using the PINNs mesh-free approach. Notably, we solve nonlinear PDEs with nontrivial, anisotropic, and inseparable solutions in less than one hour for 1000 dimensions and in 12 h for 100,000 dimensions on a single GPU using SDGD with PINNs. Since SDGD is a general training methodology of PINNs, it can be applied to any current and future variants of PINNs to scale them up for arbitrary high-dimensional PDEs.
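The core idea described in the abstract — decomposing the residual gradient into pieces corresponding to different dimensions and randomly sampling a subset of those pieces at each training iteration — can be illustrated on a toy problem. The sketch below is an illustrative reconstruction, not the authors' implementation: the quadratic "residual" stands in for a real PINN residual (e.g., the per-dimension terms of a Laplacian), and all function names are assumptions. It shows the one property the method relies on: rescaling a sampled subset of dimension pieces by d/|S| gives an unbiased estimate of the full gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1000                      # problem dimension
theta = rng.normal(size=d)    # toy "network parameters", one per dimension

def grad_piece(theta, i):
    """Gradient contribution of dimension i (toy quadratic residual 0.5*sum_i theta_i^2)."""
    g = np.zeros_like(theta)
    g[i] = theta[i]
    return g

def full_gradient(theta):
    """Exact gradient: the sum of all d dimension pieces."""
    return sum(grad_piece(theta, i) for i in range(d))

def sdgd_gradient(theta, batch_dims, rng):
    """SDGD-style estimate: sample a subset S of dimensions, rescale by d/|S|.

    E[estimate] = full gradient, since each dimension is included with
    probability |S|/d and the rescaling cancels that factor.
    """
    S = rng.choice(d, size=batch_dims, replace=False)
    return (d / batch_dims) * sum(grad_piece(theta, i) for i in S)

# Averaging many cheap sampled estimates recovers the full gradient.
g_full = full_gradient(theta)
g_est = np.mean([sdgd_gradient(theta, 32, rng) for _ in range(2000)], axis=0)
rel_error = np.linalg.norm(g_est - g_full) / np.linalg.norm(g_full)
```

In an actual PINN, each `grad_piece` would be the backpropagated gradient of one dimension's second-derivative term in the residual, so memory and compute per step scale with the sampled batch of dimensions rather than with d — which is what permits training in tens of thousands of dimensions on a single GPU.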


Similar Articles

1. Tackling the curse of dimensionality with physics-informed neural networks.
   Neural Netw. 2024 Aug;176:106369. doi: 10.1016/j.neunet.2024.106369. Epub 2024 May 7.
2. Solving high-dimensional partial differential equations using deep learning.
   Proc Natl Acad Sci U S A. 2018 Aug 21;115(34):8505-8510. doi: 10.1073/pnas.1718942115. Epub 2018 Aug 6.
3. A Combination of Deep Neural Networks and Physics to Solve the Inverse Problem of Burger's Equation.
   Annu Int Conf IEEE Eng Med Biol Soc. 2021 Nov;2021:4465-4468. doi: 10.1109/EMBC46164.2021.9630259.
4. Physics-informed kernel function neural networks for solving partial differential equations.
   Neural Netw. 2024 Apr;172:106098. doi: 10.1016/j.neunet.2024.106098. Epub 2024 Jan 2.
5. Physics-informed attention-based neural network for hyperbolic partial differential equations: application to the Buckley-Leverett problem.
   Sci Rep. 2022 May 9;12(1):7557. doi: 10.1038/s41598-022-11058-2.
6. Error estimates and physics informed augmentation of neural networks for thermally coupled incompressible Navier Stokes equations.
   Comput Mech. 2023 Aug;72(2):267-289. doi: 10.1007/s00466-023-02334-7. Epub 2023 Jun 14.
7. The New Simulation of Quasiperiodic Wave, Periodic Wave, and Soliton Solutions of the KdV-mKdV Equation via a Deep Learning Method.
   Comput Intell Neurosci. 2021 Nov 26;2021:8548482. doi: 10.1155/2021/8548482. eCollection 2021.
8. A Second-Order Network Structure Based on Gradient-Enhanced Physics-Informed Neural Networks for Solving Parabolic Partial Differential Equations.
   Entropy (Basel). 2023 Apr 18;25(4):674. doi: 10.3390/e25040674.
9. Can physics-informed neural networks beat the finite element method?
   IMA J Appl Math. 2024 May 23;89(1):143-174. doi: 10.1093/imamat/hxae011. eCollection 2024 Jan.
10. Solving the non-local Fokker-Planck equations by deep learning.
    Chaos. 2023 Apr 1;33(4). doi: 10.1063/5.0128935.

Cited By

1. Physics-informed neural networks for physiological signal processing and modeling: a narrative review.
   Physiol Meas. 2025 Jul 30;46(7):07TR02. doi: 10.1088/1361-6579/adf1d3.
2. On learning what to learn: Heterogeneous observations of dynamics and establishing possibly causal relations among them.
   PNAS Nexus. 2024 Dec 6;3(12):pgae494. doi: 10.1093/pnasnexus/pgae494. eCollection 2024 Dec.
3. Predicting adolescent psychopathology from early life factors: A machine learning tutorial.
   Glob Epidemiol. 2024 Aug 29;8:100161. doi: 10.1016/j.gloepi.2024.100161. eCollection 2024 Dec.
4. Can physics-informed neural networks beat the finite element method?
   IMA J Appl Math. 2024 May 23;89(1):143-174. doi: 10.1093/imamat/hxae011. eCollection 2024 Jan.