This article presents the Kernel Multigrid (KMG) algorithm, which accelerates the convergence of Bayesian Back-fitting for additive Gaussian processes by incorporating sparse Gaussian Process Regression (GPR). It introduces additive Gaussian processes, Bayesian Back-fitting, and Kernel Packets, outlines the computational challenges of training additive GPs, and proposes KMG as a solution, covering its theoretical basis, numerical experiments, and lower bounds on convergence rates.
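To make the setting concrete, below is a minimal sketch of classical Bayesian Back-fitting for an additive GP, the baseline that KMG accelerates. This is not the paper's KMG method itself; the RBF kernel, noise level, and iteration count are illustrative assumptions. Each sweep updates one component's posterior mean against the partial residual left by the others.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    # Squared-exponential kernel on 1-D inputs (illustrative choice)
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def backfit_additive_gp(X, y, noise=0.1, iters=20):
    """Classical back-fitting for an additive GP y = sum_j f_j(x_j) + eps.

    Each pass replaces component j's values with the GP posterior mean
    fitted to the partial residual y - sum_{k != j} f_k.
    """
    n, d = X.shape
    F = np.zeros((n, d))  # current estimate of each f_j at the data points
    Ks = [rbf(X[:, j], X[:, j]) for j in range(d)]
    # Precompute (K_j + noise^2 I)^{-1} once per coordinate
    invs = [np.linalg.solve(K + noise**2 * np.eye(n), np.eye(n)) for K in Ks]
    for _ in range(iters):
        for j in range(d):
            r = y - F.sum(axis=1) + F[:, j]    # partial residual for f_j
            F[:, j] = Ks[j] @ (invs[j] @ r)    # posterior mean on residual
    return F
```

Each sweep costs O(n^2) per component after the one-time O(n^3) factorizations; the slow convergence of these sweeps as components grow correlated is the bottleneck that motivates the multigrid-style correction in KMG.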
Source: arxiv.org