The article presents a methodology for learning stable differential operators from data. It first reviews the standard regression-based approach, which fits operators accurately to training data but offers no stability guarantee. The article then proposes a constrained regression-based approach that incorporates constraints derived from linear stability theory to learn stable differential operators.
The key highlights and insights are:
The standard regression-based approach can learn differential operators that are accurate on the training dataset but are not guaranteed to be stable, which can lead to unstable dynamics when the operators are applied outside the training dataset.
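As a concrete illustration (not the article's exact setup: the grid size, sine-wave training data, and 3-point stencil are assumptions of this sketch), the standard approach amounts to ordinary least squares over stencil coefficients, with the operator's spectrum only inspectable after the fact:

```python
import numpy as np

# Minimal sketch of the standard (unconstrained) regression approach:
# fit 3-point stencil coefficients c for d/dx to snapshot data.
n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
dx = x[1] - x[0]

blocks, targets = [], []
for k in (1, 2, 3):                      # training snapshots u = sin(kx)
    u = np.sin(k * x)
    # Features: [u_{i-1}, u_i, u_{i+1}] / dx at every point (periodic grid).
    blocks.append(np.stack([np.roll(u, 1), u, np.roll(u, -1)], axis=1) / dx)
    targets.append(k * np.cos(k * x))    # exact derivative as regression target
A, b = np.vstack(blocks), np.concatenate(targets)

# Ordinary least squares: accuracy on the data is the only objective;
# nothing here constrains the spectrum of the resulting operator.
c, *_ = np.linalg.lstsq(A, b, rcond=None)

# Assemble the periodic operator matrix and inspect its eigenvalues afterwards.
I = np.eye(n)
D = (c[0] * np.roll(I, -1, axis=1) + c[1] * I + c[2] * np.roll(I, 1, axis=1)) / dx
max_real = np.max(np.linalg.eigvals(D).real)
```

For advection-like dynamics du/dt = -a D u, any eigenvalue of -a D with positive real part means exponential growth of the solution; the regression above contains no mechanism to rule that out.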
The proposed constrained regression-based approach ensures that the learned differential operators are linearly stable by incorporating constraints derived from linear stability theory. This is done by augmenting the regression problem with constraints that require the eigenvalues of the learned operators to have non-positive real parts.
The constrained regression problem is solved using a sequential least squares programming (SLSQP) optimizer to obtain the stable learned differential operators (S-LDOs).
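A minimal sketch of the constrained variant, using SciPy's SLSQP optimizer (the diffusion-operator training data, grid size, and 3-point stencil are illustrative assumptions, not the article's exact formulation): the least-squares objective is kept, and an inequality constraint forces every eigenvalue of the learned operator to have a non-positive real part.

```python
import numpy as np
from scipy.optimize import minimize

n = 32
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
dx = x[1] - x[0]
I = np.eye(n)

def operator(c):
    """Periodic circulant matrix for the 3-point stencil c, scaled by 1/dx^2."""
    return (c[0] * np.roll(I, -1, axis=1) + c[1] * I
            + c[2] * np.roll(I, 1, axis=1)) / dx**2

# Training data for a diffusion-like operator: u = sin(kx), d2u/dx2 = -k^2 sin(kx).
rows, rhs = [], []
for k in (1, 2):
    u = np.sin(k * x)
    rows.append(np.stack([np.roll(u, 1), u, np.roll(u, -1)], axis=1) / dx**2)
    rhs.append(-k**2 * u)
A, b = np.vstack(rows), np.concatenate(rhs)

def objective(c):
    r = A @ c - b
    return 0.5 * r @ r

def stability(c):
    # SLSQP inequality constraints require fun(c) >= 0, so returning
    # -max(Re(eig)) enforces non-positive real parts for all eigenvalues.
    return -np.max(np.linalg.eigvals(operator(c)).real)

res = minimize(objective, x0=np.array([1.0, -2.0, 1.0]), method="SLSQP",
               constraints=[{"type": "ineq", "fun": stability}])
c_stable = res.x
```

With this toy data the unconstrained least-squares fit already lands slightly on the unstable side, so the constraint is active and the optimizer trades a small amount of fit accuracy for a non-positive spectrum.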
The approach is extended to learn stable differential operators for nonlinear partial differential equations by deriving stability constraints from the equations linearized around an equilibrium point.
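For intuition, a hedged sketch of this step (a constant equilibrium, the viscosity value, and central-difference stencils standing in for learned operators are all assumptions here): linearizing the 1-D Burgers equation u_t = -u u_x + nu u_xx around a constant state u_bar gives u'_t = (-u_bar D1 + nu D2) u', so the stability constraint applies to the spectrum of that linearized operator.

```python
import numpy as np

# Linearized Burgers operator L = -u_bar * D1 + nu * D2 around a constant
# equilibrium u_bar; linear stability requires Re(eig(L)) <= 0.
n, nu, u_bar = 64, 0.05, 1.0
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
dx = x[1] - x[0]
I = np.eye(n)

def circulant(c, scale):
    """Periodic operator matrix for a 3-point stencil c with scaling 1/scale."""
    return (c[0] * np.roll(I, -1, axis=1) + c[1] * I
            + c[2] * np.roll(I, 1, axis=1)) / scale

# Central-difference stencils stand in for learned D1, D2 in this sketch.
D1 = circulant([-0.5, 0.0, 0.5], dx)      # first derivative
D2 = circulant([1.0, -2.0, 1.0], dx**2)   # second derivative

L = -u_bar * D1 + nu * D2
max_growth = np.max(np.linalg.eigvals(L).real)  # <= 0 for a stable operator
```

In the article's setting the same linearized operator is built from the learned stencils, and the eigenvalue constraint on L replaces the direct constraint used in the linear case.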
The applicability of the proposed approach is demonstrated on three test cases: 1-D scalar advection-diffusion equation, 1-D Burgers equation, and 2-D advection equation. The results show that the S-LDOs provide accurate and stable solutions, outperforming the standard learned differential operators (LDOs).
Increasing the stencil size for S-LDOs improves accuracy while reducing the stiffness of the system, without compromising the stability guarantee.
Key insights from the paper by Aviral Praka... at arxiv.org, 05-02-2024
https://arxiv.org/pdf/2405.00198.pdf