Convergence Analysis of Online Regularized Statistical Learning in Reproducing Kernel Hilbert Space with Non-Stationary Data
This paper establishes the mean square consistency of an online regularized learning algorithm in a reproducing kernel Hilbert space (RKHS) with dependent and non-stationary data streams. The authors introduce the concept of a random Tikhonov regularization path and show that if the regularization path is slowly time-varying, then the output of the algorithm is consistent with the regularization path in mean square. Furthermore, if the data streams satisfy the RKHS persistence of excitation condition, then the output of the algorithm is consistent with the unknown function in mean square.
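To make the setting concrete, below is a minimal, generic sketch of online Tikhonov-regularized learning in an RKHS: the estimate is maintained as a kernel expansion and updated by a stochastic gradient step on the regularized instantaneous loss. This is an illustrative implementation of the general algorithm class, not the paper's exact scheme; the Gaussian kernel, the step-size and regularization decay rates, and the `sin` target stream are all assumptions chosen for demonstration.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel on R^d (an illustrative choice)."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

class OnlineRegularizedRKHS:
    """Generic online Tikhonov-regularized learner in an RKHS.
    The estimate is stored as a kernel expansion f_t = sum_i c_i K(x_i, .)."""

    def __init__(self, kernel=gaussian_kernel):
        self.kernel = kernel
        self.centers = []  # stream inputs x_i kept as expansion centers
        self.coeffs = []   # expansion coefficients c_i

    def predict(self, x):
        return sum(c * self.kernel(xc, x)
                   for xc, c in zip(self.centers, self.coeffs))

    def update(self, x, y, step, lam):
        # Stochastic gradient step on the regularized instantaneous loss
        #   (f(x) - y)^2 / 2 + (lam / 2) * ||f||^2,
        # i.e.  f <- (1 - step * lam) * f - step * (f(x) - y) * K(x, .)
        err = self.predict(x) - y
        self.coeffs = [(1 - step * lam) * c for c in self.coeffs]
        self.centers.append(np.asarray(x, dtype=float))
        self.coeffs.append(-step * err)
        return err

# Learn y = sin(x) from a noisy stream, with decaying step size and
# vanishing regularization (the decay rates here are illustrative).
rng = np.random.default_rng(0)
model = OnlineRegularizedRKHS()
errors = []
for t in range(1, 501):
    x = rng.uniform(-3.0, 3.0, size=1)
    y = np.sin(x[0]) + 0.1 * rng.standard_normal()
    e = model.update(x, y, step=0.5 / t ** 0.6, lam=1.0 / t ** 0.3)
    errors.append(e ** 2)

early, late = np.mean(errors[:100]), np.mean(errors[-100:])
print(early, late)  # squared prediction error should shrink as learning proceeds
```

In this sketch, the factor `(1 - step * lam)` is where Tikhonov regularization acts, shrinking the whole expansion at every step; letting `lam` decay over time corresponds to a slowly time-varying regularization path of the kind the paper's consistency analysis requires.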