The paper investigates the approximation properties of noisy Hankel matrix models, which arise naturally in the context of linear time-invariant (LTI) systems. For input and output data corrupted by measurement noise, the authors analyze how the amount of data, the depth of the Hankel matrices, and the overall model error interact.
The key findings are:
The singular values of random (noise-generated) Hankel matrices grow without bound as the amount of data increases, and this divergence rate can be accelerated by increasing the depth (number of block rows) of the matrices.
Applying this insight to the full noisy model, the authors show that increasing the depth mitigates the effect of the noise term, leading to improved self-consistency in the model's predictions.
Numerical experiments demonstrate that simply reconfiguring the Hankel matrices to a parsimonious depth substantially improves rollout performance, even without any data preprocessing; the same depth-selection strategy also proves useful for data-driven LQR control.
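The first finding can be illustrated with a small numerical sketch. The code below is not from the paper; it is a minimal demonstration, assuming the noise is i.i.d. Gaussian and the Hankel matrix is built from a scalar signal, that the smallest singular value of a noise Hankel matrix grows as the amount of data increases, at two different depths.

```python
import numpy as np

def hankel_depth(w, L):
    """Depth-L Hankel matrix of a scalar signal w of length T.

    Columns are the length-L sliding windows w[j:j+L], giving
    a matrix of shape (L, T - L + 1).
    """
    T = len(w)
    return np.stack([w[j:j + L] for j in range(T - L + 1)], axis=1)

rng = np.random.default_rng(0)

# Illustrative assumption: the "data" is pure i.i.d. Gaussian
# measurement noise; depths and data lengths are arbitrary choices.
results = {}
for L in (2, 20):                       # two Hankel depths
    sigma_min = []
    for T in (200, 2000, 20000):        # growing amounts of data
        w = rng.standard_normal(T)
        s = np.linalg.svd(hankel_depth(w, L), compute_uv=False)
        sigma_min.append(s[-1])         # smallest singular value
    results[L] = sigma_min
    print(f"depth L={L}: smallest singular values {sigma_min}")
```

For each depth, the smallest singular value increases with the data length, consistent with the divergence the paper describes; comparing the depths shows how the depth parameter shifts the singular-value behavior of the pure-noise term.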
The authors conclude that the depth of Hankel matrices is an important parameter to consider when working with noisy input-output data, as it can significantly improve the approximation properties of the resulting model.
Source: arxiv.org