
An extrapolation-driven network architecture for physics-informed deep learning.

Authors

Wang Yong, Yao Yanzhong, Gao Zhiming

Affiliations

Institute of Applied Physics and Computational Mathematics, Beijing 100088, China; Graduate School of China Academy of Engineering Physics, Beijing 100088, China; National Key Laboratory of Computational Physics, Beijing 100088, China.

Institute of Applied Physics and Computational Mathematics, Beijing 100088, China; National Key Laboratory of Computational Physics, Beijing 100088, China.

Publication Information

Neural Netw. 2025 Mar;183:106998. doi: 10.1016/j.neunet.2024.106998. Epub 2024 Dec 5.

Abstract

Current physics-informed neural network (PINN) implementations that use sequential learning strategies often exhibit several weaknesses: with a single network they fail to reproduce the results of previous training stages, with multiple networks it is difficult to strictly ensure continuity and smoothness at the time-interval nodes, and either choice increases complexity and computational overhead. To overcome these shortcomings, we first investigate the extrapolation capability of the PINN method for time-dependent PDEs. Taking advantage of this extrapolation property, we generalize the training result obtained on a specific time subinterval to larger intervals by adding a correction term to the network parameters of that subinterval. The correction term is determined by further training on the sample points in the added subinterval. Secondly, by designing an extrapolation control function with special characteristics and combining it with the correction term, we construct a new neural network architecture whose network parameters are coupled with the time variable, which we call the extrapolation-driven network architecture. Based on this architecture, and using a single neural network, we obtain an overall PINN solution over the whole domain with two characteristics: (1) it completely inherits the local solution obtained from the previous training on its subinterval, and (2) at the interval nodes it strictly maintains the continuity and smoothness possessed by the true solution. The extrapolation-driven network architecture allows us to divide a large time domain into multiple subintervals and solve the time-dependent PDEs one by one in chronological order. This training scheme respects the causality principle and effectively overcomes the difficulties of the conventional PINN method in solving evolution equations on a large time domain. Numerical experiments verify the performance of our method. The data and code accompanying this paper are available at https://github.com/wangyong1301108/E-DNN.
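The abstract does not specify the exact form of the extrapolation control function or how the correction term enters the network, so the following is only a minimal PyTorch-style sketch of one way the described mechanism could look. The class names (EDrivenLayer, EDrivenPINN), the quadratic control function, and the layer-wise coupling are illustrative assumptions, not the authors' implementation; see the linked E-DNN repository for the actual code.

```python
# Hypothetical sketch: effective weights W_eff(t) = W_prev + g(t) * dW, where
# (W_prev, b_prev) are frozen parameters from the previous time subinterval,
# (dW, db) is the trainable correction term for the newly added subinterval,
# and g is an extrapolation control function with g(t) = g'(t) = 0 for
# t <= t_node, so the earlier local solution and its smoothness at the
# interval node are inherited exactly.
import torch
import torch.nn as nn


class EDrivenLayer(nn.Module):
    """Fully connected layer whose parameters are coupled with the time variable."""

    def __init__(self, w_prev: torch.Tensor, b_prev: torch.Tensor):
        super().__init__()
        # Frozen parameters obtained on the previous subinterval.
        self.register_buffer("w_prev", w_prev.detach().clone())
        self.register_buffer("b_prev", b_prev.detach().clone())
        # Trainable correction term, initialized to zero.
        self.dw = nn.Parameter(torch.zeros_like(w_prev))
        self.db = nn.Parameter(torch.zeros_like(b_prev))

    def forward(self, h: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        # h: (N, d_in), g: (N, 1) -- one control value per sample time.
        w_eff = self.w_prev + g.unsqueeze(-1) * self.dw   # (N, d_out, d_in)
        b_eff = self.b_prev + g * self.db                  # (N, d_out)
        return torch.einsum("noi,ni->no", w_eff, h) + b_eff


def control(t: torch.Tensor, t_node: float, width: float) -> torch.Tensor:
    """One possible control function (an assumption, not the paper's choice):
    zero for t <= t_node, then ((t - t_node) / width) ** 2, which keeps both
    the value and the first derivative continuous at the interval node."""
    s = torch.clamp((t - t_node) / width, min=0.0)
    return s ** 2


class EDrivenPINN(nn.Module):
    """Wraps the network trained on [t0, t_node] for extension to a larger interval."""

    def __init__(self, prev_net: nn.Sequential, t_node: float, width: float):
        super().__init__()
        linears = [m for m in prev_net if isinstance(m, nn.Linear)]
        self.layers = nn.ModuleList(EDrivenLayer(m.weight, m.bias) for m in linears)
        self.t_node, self.width = t_node, width

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        g = control(t, self.t_node, self.width)
        h = torch.cat([x, t], dim=-1)
        for layer in self.layers[:-1]:
            h = torch.tanh(layer(h, g))
        return self.layers[-1](h, g)
```

Under these assumptions, only the correction terms dw and db are trainable: the usual PINN residual loss is minimized on sample points from the newly added subinterval, while on the previous subinterval g(t) = 0 forces the output to coincide with the already trained network, so the earlier solution is not overwritten.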

