Core Concepts
Leveraging machine learning for stable linear subspace identification with SIMBa.
Abstract
Introduction:
Historically, machine learning (ML) and linear system identification (SI) developed independently.
SIMBa introduces a family of linear multi-step-ahead state-space SI methods using backpropagation.
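Concretely, the goal is to fit the matrices (A, B, C, D) of a discrete-time LTI model by minimizing a multi-step-ahead (simulation) error. A standard formulation of this objective, which may differ from the paper's exact loss in details such as initial-state handling, is:

```latex
\begin{aligned}
  x_{k+1} &= A x_k + B u_k, \qquad y_k = C x_k + D u_k,\\
  \mathcal{L}(A,B,C,D) &= \frac{1}{T} \sum_{k=0}^{T-1} \bigl\lVert y_k - \hat{y}_k \bigr\rVert_2^2,
\end{aligned}
```

where the prediction \hat{y}_k is obtained by simulating the model forward from x_0 using only the inputs, so errors compound over the horizon and gradients flow through every step via backpropagation.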
ML Tools for SI:
ML tools such as neural networks (NNs) have been used extensively for nonlinear system identification.
However, generic NNs tend to underperform dedicated linear methods when the underlying system is linear.
Importance of Linear Models:
Linear Time-Invariant (LTI) models are crucial for many applications.
Industry relies on linear models for simulation, analysis, and controller design.
SIMBa Framework:
SIMBa leverages backpropagation and unconstrained gradient descent for stable linear SI: stability holds by construction, so no constrained solver is required (a training sketch follows below).
It provides a ready-to-use, open-source Python implementation with GPU support.
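As an illustration of this training idea, below is a minimal PyTorch-style sketch of multi-step-ahead fitting with unconstrained gradient descent. All names, shapes, and hyperparameters are hypothetical, not the actual SIMBa API, and stability enforcement is omitted here (see the parametrization sketch in the next section):

```python
import torch

n, m, p, T = 4, 2, 1, 200      # hypothetical state/input/output sizes and horizon

# Unconstrained system matrices, trained directly by gradient descent.
A = torch.nn.Parameter(0.1 * torch.randn(n, n))
B = torch.nn.Parameter(0.1 * torch.randn(n, m))
C = torch.nn.Parameter(0.1 * torch.randn(p, n))
D = torch.nn.Parameter(torch.zeros(p, m))

u = torch.randn(T, m)          # measured inputs (placeholder data)
y = torch.randn(T, p)          # measured outputs (placeholder data)

opt = torch.optim.Adam([A, B, C, D], lr=1e-2)

for _ in range(500):
    opt.zero_grad()
    x = torch.zeros(n)         # initial state assumed known for simplicity
    y_hat = []
    for k in range(T):         # unroll the model over the whole horizon
        y_hat.append(C @ x + D @ u[k])
        x = A @ x + B @ u[k]   # multi-step: the predicted state is fed forward
    loss = torch.mean((torch.stack(y_hat) - y) ** 2)
    loss.backward()            # backpropagation through the unrolled dynamics
    opt.step()
```

Because the loss is computed on a full simulation rather than one-step-ahead predictions, the optimizer directly targets the long-horizon behavior that matters for simulation and control.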
Performance Analysis:
SIMBa generally outperforms traditional linear SI methods by over 25%.
Extensive empirical investigations on both simulated and real-world systems showcase SIMBa's flexibility and performance.
Free Parametrization of Schur Matrices:
Proposition 1 provides a free, unconstrained parametrization of Schur matrices derived from an LMI stability condition: every choice of the free parameters yields a stable state matrix A, which is what enables SIMBa's unconstrained gradient descent (a sketch follows).
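The paper's exact formula is not reproduced here; the sketch below implements one standard free parametrization equivalent to the Stein LMI condition A Q Aᵀ − Q ≺ 0 (an assumed construction with illustrative names, not necessarily the paper's Proposition 1). Any unconstrained choice of G and H yields a Schur-stable A, and every Schur matrix is reachable this way, so plain gradient descent can never leave the stable set:

```python
import torch

def schur_from_free(G: torch.Tensor, H: torch.Tensor, eps: float = 1e-4) -> torch.Tensor:
    """Map two arbitrary n-by-n matrices (G, H) to a Schur-stable A.

    Construction:  Q = G G^T + eps*I           (free positive definite matrix)
                   B = H (I + H^T H)^{-1/2}    (free contraction, ||B||_2 < 1)
                   A = Q^{1/2} B Q^{-1/2}      (similar to B, so rho(A) < 1)
    Then A Q A^T = Q^{1/2} B B^T Q^{1/2} < Q, i.e. the Stein LMI holds.
    """
    n = G.shape[0]
    I = torch.eye(n, dtype=G.dtype)
    Q = G @ G.T + eps * I
    # Symmetric eigendecompositions give differentiable matrix (inverse) square roots.
    lq, Vq = torch.linalg.eigh(Q)
    Q_half = Vq @ torch.diag(lq.sqrt()) @ Vq.T
    Q_inv_half = Vq @ torch.diag(lq.rsqrt()) @ Vq.T
    lh, Vh = torch.linalg.eigh(I + H.T @ H)
    B = H @ Vh @ torch.diag(lh.rsqrt()) @ Vh.T
    return Q_half @ B @ Q_inv_half

# Sanity check: the spectral radius stays strictly below 1 for any G, H.
A = schur_from_free(torch.randn(3, 3), torch.randn(3, 3))
print(torch.linalg.eigvals(A).abs().max())  # < 1
```

In training, G and H would be the learnable parameters and A would be recomputed from them at every step, so stability is maintained throughout optimization rather than checked or projected afterwards.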
Numerical Experiments:
SIMBa demonstrates superior performance on random stable models and real-world data.
It outperforms traditional SI methods and guarantees stability without sacrificing accuracy.
Training Time:
SIMBa L, a variant trained for longer, achieves improved performance but requires more computational resources.
Stats
SIMBa often outperforms traditional SI methods by over 25%.
On real-world data, SIMBa shows improvements of 73–86% compared to MATLAB's system identification methods.
Training times for SIMBa range from 5 to 25 minutes on a MacBook Pro.
Quotes
"SIMBa generally outperforms traditional linear state-space SI methods."
"SIMBa consistently attained the best performance for meaningful choices of n."