Learning Stable and Passive Neural Differential Equations with Guaranteed Convergence Properties
This paper introduces a novel class of neural differential equations that are intrinsically Lyapunov stable, exponentially stable, or passive. The proposed models have a Hamiltonian structure in which the Hamiltonian function is guaranteed to satisfy quadratic bounds, and these bounds in turn ensure the stability and passivity properties by construction.
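To make the idea concrete, below is a minimal, illustrative sketch (not the paper's exact parameterization) of a Hamiltonian-structured neural ODE dx/dt = (J - R)∇H(x), where J is skew-symmetric, R is positive semidefinite, and H(x) = (μ/2)‖x‖² plus a squared neural term, so that H admits a quadratic lower bound and dH/dt = -∇H(x)ᵀ R ∇H(x) ≤ 0 along trajectories, giving Lyapunov stability of the origin. All names (`init_params`, `feature_net`, `hamiltonian`, `vector_field`) and the specific architecture are assumptions made for illustration.

```python
# Illustrative sketch of a stable Hamiltonian-structured neural ODE (JAX).
# Assumed construction, not the paper's exact model:
#   dx/dt = (J - R) grad H(x),  J skew-symmetric, R positive semidefinite,
#   H(x) = (mu/2)||x||^2 + ||f_theta(x) - f_theta(0)||^2  >=  (mu/2)||x||^2.
import jax
import jax.numpy as jnp

def init_params(key, dim, hidden):
    k1, k2, k3, k4 = jax.random.split(key, 4)
    return {
        "W1": 0.1 * jax.random.normal(k1, (hidden, dim)),
        "b1": jnp.zeros(hidden),
        "W2": 0.1 * jax.random.normal(k2, (hidden, hidden)),
        "b2": jnp.zeros(hidden),
        "A": 0.1 * jax.random.normal(k3, (dim, dim)),   # builds skew-symmetric J
        "B": 0.1 * jax.random.normal(k4, (dim, dim)),   # builds PSD R
    }

def feature_net(params, x):
    # Small MLP whose output is used inside a squared (hence nonnegative) term.
    h = jnp.tanh(params["W1"] @ x + params["b1"])
    return params["W2"] @ h + params["b2"]

def hamiltonian(params, x, mu=0.1):
    # Quadratic lower bound (mu/2)||x||^2 holds because the learned term is
    # a squared norm and therefore nonnegative.
    g = feature_net(params, x) - feature_net(params, jnp.zeros_like(x))
    return 0.5 * mu * jnp.dot(x, x) + jnp.dot(g, g)

def vector_field(params, x):
    J = params["A"] - params["A"].T          # skew-symmetric: conserves H
    R = params["B"] @ params["B"].T          # PSD: dissipates H
    gradH = jax.grad(hamiltonian, argnums=1)(params, x)
    return (J - R) @ gradH

# Usage: integrate with forward Euler; H is non-increasing along the flow.
key = jax.random.PRNGKey(0)
params = init_params(key, dim=2, hidden=16)
x = jnp.array([1.0, -0.5])
for _ in range(200):
    x = x + 0.01 * vector_field(params, x)
print(hamiltonian(params, x))
```

Because the skew-symmetric part contributes nothing to dH/dt, the dissipation matrix R alone controls the energy decrease; stronger assumptions on R and on the bounds of H would be needed for exponential stability or passivity, as discussed in the paper.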