Core Concepts
The authors present a structure-preserving Eulerian algorithm for solving L2-gradient flows and a structure-preserving Lagrangian algorithm for solving generalized diffusions, both of which use neural networks for spatial discretization. The proposed schemes are built directly on the system's energy-dissipation law, which guarantees monotonic decay of the discrete free energy and ensures long-term stability of the numerical computation.
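For context, the generic form of an L2-gradient flow and its energy-dissipation law is the following (a standard textbook statement, not a formula quoted from the paper):

```latex
% L2-gradient flow of a free energy \mathcal{F}[u]:
u_t = -\,\frac{\delta \mathcal{F}}{\delta u},
\qquad
\frac{\mathrm{d}}{\mathrm{d}t}\,\mathcal{F}[u(t)]
  = \int \frac{\delta \mathcal{F}}{\delta u}\, u_t \,\mathrm{d}x
  = -\,\| u_t \|_{L^2}^2 \;\le\; 0 .
```

It is this inequality that the discrete schemes are designed to preserve at the level of the neural-network approximation.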
Abstract
The paper introduces a novel numerical framework, called Energetic Variational Neural Network (EVNN), for solving gradient flows and generalized diffusions. The key ideas are:
Eulerian EVNN for L2-gradient flows:
Constructs a finite-dimensional approximation to the continuous energy-dissipation law by introducing a neural network-based spatial discretization.
Performs temporal discretization before spatial discretization, which sidesteps the difficulties that the nonlinearity of a neural-network discretization would otherwise introduce.
Formulates the update of the neural network parameters as a minimizing movement scheme that guarantees the monotonic decay of the discrete free energy.
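The minimizing-movement update described above can be sketched as follows. This is a toy stand-in, not the paper's method: the "network" u_theta is a linear model on fixed polynomial features, the free energy is a simple quadratic placeholder, the L2 norm is approximated by an average over collocation points, and finite differences stand in for automatic differentiation.

```python
import numpy as np

# Toy minimizing-movement scheme: each time step solves (approximately)
#   theta^{n+1} = argmin_theta  F(u_theta) + ||u_theta - u^n||^2 / (2*tau).

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 64)
feats = np.stack([np.ones_like(x), x, x**2], axis=1)   # fixed basis features

def u(theta):
    return feats @ theta

def free_energy(theta):
    return 0.5 * np.mean(u(theta) ** 2)                # placeholder F

def mm_objective(theta, u_prev, tau):
    # J_n(theta) = F(u_theta) + ||u_theta - u^n||^2 / (2*tau)
    return free_energy(theta) + np.mean((u(theta) - u_prev) ** 2) / (2 * tau)

def grad(f, theta, eps=1e-6):
    # central finite differences (stand-in for automatic differentiation)
    g = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (f(theta + d) - f(theta - d)) / (2 * eps)
    return g

theta = rng.normal(size=3)
tau = 0.1
energies = [free_energy(theta)]
for n in range(20):                                    # outer time steps
    u_prev = u(theta)
    obj = lambda th: mm_objective(th, u_prev, tau)
    for _ in range(200):                               # inner minimization
        theta = theta - 0.05 * grad(obj, theta)
    energies.append(free_energy(theta))

# F(u^{n+1}) <= J_n(theta^{n+1}) <= J_n(theta^n) = F(u^n): the discrete
# free energy decays monotonically, up to inner-solver tolerance.
assert all(e1 <= e0 + 1e-8 for e0, e1 in zip(energies, energies[1:]))
```

The chain of inequalities in the final comment is the structural point: because the inner optimizer only decreases the objective J_n, the discrete free energy can never increase between time steps, regardless of how accurately the inner problem is solved.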
Lagrangian EVNN for generalized diffusions:
Views the generalized diffusion as an L2-gradient flow of the flow map in the space of diffeomorphisms.
Seeks an optimal flow map between consecutive time steps, rather than the full flow map, to improve computational efficiency.
Parameterizes the flow map with a neural network, specifically a convex potential flow architecture, which guarantees that the map is a diffeomorphism.
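The idea of a convex-potential flow map can be illustrated with a small numpy stand-in (not the actual architecture from the paper): the one-step map is taken to be the gradient of a strictly convex potential phi, which makes it monotone and hence invertible.

```python
import numpy as np

# Toy convex-potential map: phi(x) = 0.5*||x||^2
#   + sum_j a_j * softplus(w_j . x + b_j) with a_j >= 0,
# so phi is strictly convex and Phi = grad(phi) is a monotone,
# invertible map (a diffeomorphism).

rng = np.random.default_rng(1)
d, m = 2, 8                       # dimension, number of softplus units
W = rng.normal(size=(m, d))
b = rng.normal(size=m)
a = np.abs(rng.normal(size=m))    # nonnegative weights keep phi convex

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def flow_map(x):
    # Phi(x) = grad phi(x) = x + sum_j a_j * sigmoid(w_j . x + b_j) * w_j
    z = W @ x + b
    return x + W.T @ (a * sigmoid(z))

# Monotonicity check: (Phi(x) - Phi(y)) . (x - y) > 0 for x != y,
# the defining property of gradients of strictly convex potentials.
for _ in range(100):
    x, y = rng.normal(size=d), rng.normal(size=d)
    assert (flow_map(x) - flow_map(y)) @ (x - y) > 0.0
```

In the paper's Lagrangian scheme the role of such a map is to transport particle positions between consecutive time steps while remaining invertible; the specific potential above is only an illustrative choice.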
The proposed EVNN methods are mesh-free and can solve high-dimensional gradient flows. Numerical experiments demonstrate the accuracy and energy stability of the EVNN schemes.