
Synergistic learning with multi-task DeepONet for efficient PDE problem solving.

Authors

Kumar Varun, Goswami Somdatta, Kontolati Katiana, Shields Michael D, Karniadakis George Em

Affiliations

School of Engineering, Brown University, United States of America.

Department of Civil and Systems Engineering, Johns Hopkins University, United States of America.

Publication

Neural Netw. 2025 Apr;184:107113. doi: 10.1016/j.neunet.2024.107113. Epub 2025 Jan 3.

DOI: 10.1016/j.neunet.2024.107113
PMID: 39793491
Abstract

Multi-task learning (MTL) is an inductive transfer mechanism designed to leverage useful information from multiple tasks to improve generalization performance compared to single-task learning. It has been extensively explored in traditional machine learning to address issues such as data sparsity and overfitting in neural networks. In this work, we apply MTL to problems in science and engineering governed by partial differential equations (PDEs). However, implementing MTL in this context is complex, as it requires task-specific modifications to accommodate various scenarios representing different physical processes. To this end, we present a multi-task deep operator network (MT-DeepONet) to learn solutions across various functional forms of source terms in a PDE and multiple geometries in a single concurrent training session. We introduce modifications in the branch network of the vanilla DeepONet to account for various functional forms of a parameterized coefficient in a PDE. Additionally, we handle parameterized geometries by introducing a binary mask in the branch network and incorporating it into the loss term to improve convergence and generalization to new geometry tasks. Our approach is demonstrated on three benchmark problems: (1) learning different functional forms of the source term in the Fisher equation; (2) learning multiple geometries in a 2D Darcy flow problem and showcasing better transfer learning capabilities to new geometries; and (3) learning 3D parameterized geometries for a heat transfer problem and demonstrating the ability to predict on new but similar geometries. Our MT-DeepONet framework offers a novel approach to solving PDE problems in engineering and science under a unified umbrella based on synergistic learning that reduces the overall training cost for neural operators.
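To make the architecture described above concrete, the following is a minimal sketch of a DeepONet-style model with the two ingredients the abstract highlights: a branch network that encodes task inputs (e.g., source-term samples concatenated with a binary geometry mask) and a masked loss that restricts the error to points inside the geometry. All class and function names, layer sizes, and input dimensions here are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class MTDeepONetSketch(nn.Module):
    """Illustrative multi-task DeepONet: the branch net encodes the task
    input (e.g. source-term samples plus a flattened binary geometry mask),
    the trunk net encodes query coordinates, and their inner product gives
    the predicted solution value at each coordinate."""

    def __init__(self, branch_in: int, trunk_in: int = 2, p: int = 64):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Linear(branch_in, 128), nn.Tanh(), nn.Linear(128, p))
        self.trunk = nn.Sequential(
            nn.Linear(trunk_in, 128), nn.Tanh(), nn.Linear(128, p))

    def forward(self, u: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # u: (batch, branch_in) task inputs; y: (n_points, trunk_in) coords
        b = self.branch(u)   # (batch, p) task embeddings
        t = self.trunk(y)    # (n_points, p) coordinate embeddings
        return b @ t.T       # (batch, n_points) predicted field

def masked_mse(pred: torch.Tensor, target: torch.Tensor,
               mask: torch.Tensor) -> torch.Tensor:
    """MSE restricted by a binary mask, echoing the paper's idea of
    folding the geometry mask into the loss so points outside the
    domain do not contribute."""
    return ((pred - target) ** 2 * mask).sum() / mask.sum().clamp(min=1)

# Example forward pass over a batch of 4 tasks and 50 query points.
model = MTDeepONetSketch(branch_in=100)
u = torch.randn(4, 100)          # sampled source terms + mask features
y = torch.rand(50, 2)            # 2D query coordinates
pred = model(u, y)               # (4, 50)
loss = masked_mse(pred, torch.zeros(4, 50), torch.ones(4, 50))
```

Training all tasks through one shared branch/trunk pair in a single session is what yields the synergistic learning and reduced per-operator training cost claimed in the abstract.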


Similar Articles

1. Synergistic learning with multi-task DeepONet for efficient PDE problem solving.
   Neural Netw. 2025 Apr;184:107113. doi: 10.1016/j.neunet.2024.107113. Epub 2025 Jan 3.
2. U-DeepONet: U-Net enhanced deep operator network for geologic carbon sequestration.
   Sci Rep. 2024 Sep 12;14(1):21298. doi: 10.1038/s41598-024-72393-0.
3. A scalable framework for learning the geometry-dependent solution operators of partial differential equations.
   Nat Comput Sci. 2024 Dec;4(12):928-940. doi: 10.1038/s43588-024-00732-2. Epub 2024 Dec 9.
4. Kolmogorov n-widths for multitask physics-informed machine learning (PIML) methods: Towards robust metrics.
   Neural Netw. 2024 Dec;180:106703. doi: 10.1016/j.neunet.2024.106703. Epub 2024 Sep 4.
5. Learning Only on Boundaries: A Physics-Informed Neural Operator for Solving Parametric Partial Differential Equations in Complex Geometries.
   Neural Comput. 2024 Feb 16;36(3):475-498. doi: 10.1162/neco_a_01647.
6. Machine-learning-based spectral methods for partial differential equations.
   Sci Rep. 2023 Jan 31;13(1):1739. doi: 10.1038/s41598-022-26602-3.
7. PDE-LEARN: Using deep learning to discover partial differential equations from noisy, limited data.
   Neural Netw. 2024 Jun;174:106242. doi: 10.1016/j.neunet.2024.106242. Epub 2024 Mar 16.
8. Strategies for multi-case physics-informed neural networks for tube flows: a study using 2D flow scenarios.
   Sci Rep. 2024 May 21;14(1):11577. doi: 10.1038/s41598-024-62117-9.
9. GeneralizedDTA: combining pre-training and multi-task learning to predict drug-target binding affinity for unknown drug discovery.
   BMC Bioinformatics. 2022 Sep 7;23(1):367. doi: 10.1186/s12859-022-04905-6.
10. MetaNO: How to Transfer Your Knowledge on Learning Hidden Physics.
    Comput Methods Appl Mech Eng. 2023 Dec 15;417(Pt B). doi: 10.1016/j.cma.2023.116280. Epub 2023 Jul 28.