Offen Christian, Ober-Blöbaum Sina
Department of Mathematics, Paderborn University, Warburger Str. 100, 33098 Paderborn, Germany.
Chaos. 2024 Jan 1;34(1). doi: 10.1063/5.0172287.
We show how to learn discrete field theories from observational data of fields on a space-time lattice. To this end, we train a neural network model of a discrete Lagrangian density such that the discrete Euler-Lagrange equations are consistent with the given training data. We thus obtain a structure-preserving machine learning architecture. Lagrangian densities are not uniquely determined by the solutions of a field theory. We introduce a technique to derive regularizers for the training process that optimize the numerical regularity of the discrete field theory. Minimization of these regularizers guarantees that, close to the training data, the discrete field theory behaves robustly and efficiently when used in numerical simulations. Further, we show how to identify structurally simple solutions of the underlying continuous field theory, such as traveling waves. This is possible even when traveling waves are not present in the training data. We compare this to approaches based on data-driven model order reduction, which struggle to identify suitable latent spaces containing structurally simple solutions when these are not present in the training data. The ideas are demonstrated on examples based on the wave equation and the Schrödinger equation.
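The training setup described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a small network models a discrete Lagrangian density on one lattice cell, and the training loss penalizes the discrete Euler-Lagrange (DEL) residual on observed field data. The key observation used here is that differentiating the sum of the density over all cells with respect to a node value accumulates exactly the four DEL terms at that node. All names (`Ld`, `del_residual`, the toy data) are illustrative assumptions.

```python
# Hedged sketch: fit a neural discrete Lagrangian density so that the
# discrete Euler-Lagrange (DEL) equations hold on observed lattice data.
import torch

torch.manual_seed(0)

# Model of a discrete Lagrangian density on one space-time cell:
# Ld(u[n,j], u[n,j+1], u[n+1,j], u[n+1,j+1]) -> scalar.
Ld = torch.nn.Sequential(
    torch.nn.Linear(4, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

def del_residual(u):
    """DEL residual at interior lattice nodes of u with shape (N_t, N_x).

    Summing Ld over all cells and differentiating w.r.t. u gives, at each
    node, the sum of the partial derivatives of Ld over the four cells
    touching that node -- exactly the discrete Euler-Lagrange expression."""
    u = u.clone().requires_grad_(True)
    # Four corners of every cell, flattened to shape (num_cells, 4).
    corners = torch.stack(
        [u[:-1, :-1], u[:-1, 1:], u[1:, :-1], u[1:, 1:]], dim=-1
    ).reshape(-1, 4)
    total = Ld(corners).sum()
    grad, = torch.autograd.grad(total, u, create_graph=True)
    return grad[1:-1, 1:-1]  # boundary nodes touch fewer than four cells

# Toy training data: a traveling-wave-like field on a coarse lattice.
t = torch.linspace(0.0, 1.0, 20).unsqueeze(1)
x = torch.linspace(0.0, 1.0, 30).unsqueeze(0)
u_data = torch.sin(torch.pi * (x - t))

opt = torch.optim.Adam(Ld.parameters(), lr=1e-3)
initial_loss = del_residual(u_data).pow(2).mean().item()
for step in range(200):
    opt.zero_grad()
    loss = del_residual(u_data).pow(2).mean()
    loss.backward()
    opt.step()
final_loss = del_residual(u_data).pow(2).mean().item()
```

Note that this bare loss is degenerate: a constant density makes the residual vanish identically, which reflects the abstract's point that Lagrangian densities are not uniquely determined by the solutions. The regularizers introduced in the article address this by optimizing the numerical regularity of the learned theory; they are not reproduced in this sketch.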