This paper provides a comprehensive technical introduction to variational inference (VI) for solving forward and inverse problems in physics. It covers the key concepts and mathematical formulations required to understand and apply VI in this context.
The paper first introduces the fundamentals of physical modelling with partial differential equations (PDEs) and discusses both forward and inverse problems. It then presents a detailed derivation of VI, highlighting its advantages over traditional Markov chain Monte Carlo (MCMC) sampling for problems involving nonlinear models.
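The derivation of VI mentioned above centres on the evidence lower bound (ELBO), the standard objective in any VI treatment: maximising the ELBO over a tractable family of distributions q is equivalent to minimising the KL divergence from q to the intractable true posterior. In the usual notation (θ for model parameters, y for observations):

```latex
\log p(y) = \mathrm{ELBO}(q) + \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta \mid y)\right),
\qquad
\mathrm{ELBO}(q) = \mathbb{E}_{q(\theta)}\!\left[\log p(y \mid \theta)\right]
                 - \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta)\right).
```

Because log p(y) is fixed, a larger ELBO means a smaller KL gap to the true posterior, which is what makes the ELBO a usable surrogate objective when the posterior itself cannot be evaluated.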
The authors then review several salient methods from the literature that demonstrate the versatility of VI for physics-informed deep generative modelling. These methods can be broadly categorized into two approaches:
Forward-model-based learning: These methods embed the known physical forward model into the probabilistic generative model, allowing calibrated posterior estimates of the model parameters to be learned.
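To make the forward-model-based idea concrete, here is a minimal sketch, not taken from the paper: a toy linear forward model G(θ) = aθ stands in for a PDE solve, and a Monte Carlo ELBO is estimated for a Gaussian variational posterior via the reparameterisation trick. All names and the setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: known linear forward model G(theta) = a * theta
# (standing in for a PDE solve); observations y = G(theta_true) + noise.
a, sigma, theta_true = 2.0, 0.5, 1.5
y = a * theta_true + sigma * rng.standard_normal(20)

def elbo(mu, log_s, n_samples=2000):
    """Monte Carlo ELBO for q(theta) = N(mu, exp(log_s)^2), with prior
    theta ~ N(0, 1) and Gaussian likelihood y | theta ~ N(a*theta, sigma^2)."""
    s = np.exp(log_s)
    theta = mu + s * rng.standard_normal(n_samples)      # reparameterised samples
    log_lik = (-0.5 * np.sum((y[None, :] - a * theta[:, None]) ** 2, axis=1)
               / sigma**2 - len(y) * np.log(sigma * np.sqrt(2 * np.pi)))
    log_prior = -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)
    log_q = -0.5 * ((theta - mu) / s) ** 2 - np.log(s * np.sqrt(2 * np.pi))
    return np.mean(log_lik + log_prior - log_q)

# This conjugate toy model has a closed-form Gaussian posterior, so we can
# check that the ELBO peaks at the exact posterior parameters.
post_var = 1.0 / (1.0 + len(y) * a**2 / sigma**2)
post_mean = post_var * a * np.sum(y) / sigma**2
print(post_mean, elbo(post_mean, 0.5 * np.log(post_var)))
```

In a realistic setting G would be a numerical PDE solver and the ELBO would be maximised with stochastic gradients; the estimator itself is unchanged.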
Residual-based learning: These methods leverage the physics of the problem by incorporating the residual of the PDE directly into the VI objective, enabling the construction of efficient surrogate models that can quantify uncertainty.
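A deterministic sketch of the residual-based idea, again an illustrative assumption rather than the paper's method: a polynomial surrogate for the ODE u'(x) + u(x) = 0 with u(0) = 1 is fitted by minimising the squared equation residual at collocation points (the uncertainty-quantification part of the VI objective is omitted for brevity).

```python
import numpy as np

# Surrogate u(x) = sum_k w_k x^k; residual r(x) = u'(x) + u(x) should vanish.
deg = 4
xs = np.linspace(0.0, 1.0, 20)                     # collocation points
powers = np.arange(deg + 1)

# Residual rows: u'(x) + u(x) = sum_k w_k (k x^{k-1} + x^k)
A = powers * xs[:, None] ** np.clip(powers - 1, 0, None) + xs[:, None] ** powers
b = np.zeros(len(xs))

# Boundary condition u(0) = 1, weighted so it is enforced strongly.
bc_row = (powers == 0).astype(float) * 10.0
A = np.vstack([A, bc_row])
b = np.append(b, 10.0)

# Least-squares minimisation of the stacked residual + boundary system.
w, *_ = np.linalg.lstsq(A, b, rcond=None)
u = lambda x: np.polyval(w[::-1], x)               # polyval wants high degree first
print(u(1.0))  # exact solution is e^{-x}, so u(1) should be near 0.3679
```

The same structure carries over to PDEs: the residual rows become PDE residuals at interior collocation points, and in the VI setting the squared-residual term enters the objective alongside the KL terms, so the surrogate also carries uncertainty estimates.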
The reviewed techniques showcase the flexibility of VI in handling a wide range of physics-based problems, from uncertainty quantification and inverse problems to the learning of generative models that respect the underlying physical constraints.
Key insights distilled from: Alex Glyn-Da... at arxiv.org, 09-11-2024
https://arxiv.org/pdf/2409.06560.pdf