This article summarizes the Kernel Multigrid (KMG) algorithm, which accelerates the convergence of Bayesian Back-fitting for additive Gaussian Processes using sparse Gaussian Process Regression (GPR). It introduces additive GPs, Bayesian Back-fitting, and Kernel Packets, outlines the computational challenges of training additive GPs, and presents KMG as a solution, covering its theoretical basis, numerical experiments, and lower bounds on convergence rates.
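To make the back-fitting idea concrete, here is a minimal sketch of classical back-fitting for an additive model, where each component is updated in turn by smoothing the partial residuals with a kernel ridge smoother. This is a generic illustration, not the paper's KMG algorithm; the kernel, lengthscale, and regularization values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_smooth(x, r, lam=1e-1, ls=0.5):
    # Gaussian-kernel ridge smoother: fitted values K (K + lam I)^{-1} r
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * ls**2))
    return K @ np.linalg.solve(K + lam * np.eye(len(x)), r)

# Synthetic additive data: y = sin(x1) + x2^2 + noise
n = 200
X = rng.uniform(-2, 2, size=(n, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.standard_normal(n)

# Back-fitting: cycle over components, smoothing each partial residual
f = np.zeros((n, 2))
for _ in range(20):
    for d in range(2):
        others = [j for j in range(2) if j != d]
        resid = y - y.mean() - f[:, others].sum(axis=1)
        f[:, d] = kernel_smooth(X[:, d], resid)
        f[:, d] -= f[:, d].mean()  # center components for identifiability

pred = y.mean() + f.sum(axis=1)
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

Each sweep costs a linear solve per component; the paper's point is that for additive GPs this iteration converges slowly, and KMG uses a sparse-GPR coarse correction to speed it up.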
Source: arxiv.org