
Transfer learning of neural operators for partial differential equations based on sparse network λ-FNO.

Author Information

Xu Jinghong, Zhou Yuqian, Liu Qian, Li Kebing, Yang Haolin

Affiliations

College of Applied Mathematics, Chengdu University of Information Technology, Chengdu, Sichuan, P. R. China.

College of Mathematics, Southwest Minzu University, Chengdu, Sichuan, P. R. China.

Publication Information

PLoS One. 2025 May 22;20(5):e0321154. doi: 10.1371/journal.pone.0321154. eCollection 2025.

Abstract

When the solution domain, internal parameters, or initial and boundary conditions of a partial differential equation (PDE) change, many latent characteristics of the equation's solutions remain similar. This makes it possible to reduce the cost of PDE operator learning through transfer learning. Building on the Fourier neural operator (FNO), we propose a novel sparse neural operator network named λ-FNO. By introducing a λ parameter matrix and applying a new pruning method to sparsify the network, the operator learning ability of λ-FNO is greatly improved. λ-FNO can efficiently learn the operator mapping from a discrete initial function space on a uniform grid to the discrete solution space of the equation on an unstructured grid, which is not possible with FNO. Finally, we apply λ-FNO to several specific PDE transfer tasks under conditional distributions to demonstrate its excellent transferability. The experimental results show that when the shape of the equation's solution domain or its internal parameters change, our framework captures the latent invariant information of the solution and completes the related transfer learning tasks at lower cost, with higher accuracy, and at faster speed. In addition, the sparse framework is highly extensible and can easily be applied to other network architectures to enhance their performance. Our model and data-generation code are available at https://github.com/Xumouren12/TL-FNO.
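To make the two ingredients the abstract names concrete, here is a minimal sketch of one 1-D spectral (Fourier) layer with an element-wise λ gating matrix applied to the learnable spectral weights, followed by magnitude-based pruning of the gated weights. This is an illustrative assumption about how such a layer could look, not the paper's actual implementation; the names `lam`, `prune_ratio`, and `n_modes` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fourier_layer(u, weights, lam, prune_ratio=0.5, n_modes=8):
    """Sparse spectral convolution applied to a real signal u of shape (n,).

    weights     : complex array (n_modes,), learnable spectral multipliers
    lam         : real array (n_modes,), element-wise lambda gating matrix
    prune_ratio : fraction of smallest-magnitude gated weights zeroed out
    """
    u_hat = np.fft.rfft(u)                       # transform to Fourier space
    w = lam * weights                            # lambda-gated spectral weights
    k = int(prune_ratio * n_modes)
    if k > 0:
        idx = np.argsort(np.abs(w))[:k]          # smallest-magnitude entries
        w = w.copy()
        w[idx] = 0.0                             # magnitude pruning -> sparsity
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * w      # keep only the low modes
    return np.fft.irfft(out_hat, n=u.shape[0])   # back to physical space

n = 64
u = np.sin(2 * np.pi * np.arange(n) / n)         # toy initial function
weights = rng.standard_normal(8) + 1j * rng.standard_normal(8)
lam = rng.standard_normal(8)
v = fourier_layer(u, weights, lam)
print(v.shape)
```

In a transfer-learning setting one could freeze `weights` after training on the source PDE and fine-tune only the small `lam` vector on the target task; that division of labor is one plausible reading of how a λ parameter matrix lowers transfer cost.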


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bc90/12097631/f90f26f4a1b7/pone.0321154.g001.jpg
