Addressing Model Collapse in Gaussian Process Latent Variable Models through Projection Variance Learning and Flexible Kernel Integration
This paper addresses model collapse in Gaussian Process Latent Variable Models (GPLVMs) in two ways: 1) it theoretically examines how the projection variance influences model collapse, and 2) it integrates a flexible spectral mixture kernel with a differentiable random Fourier feature approximation, enhancing kernel flexibility while enabling efficient, scalable learning.
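To illustrate the second ingredient, the sketch below shows how a spectral mixture (SM) kernel can be approximated with random Fourier features: because the SM kernel's spectral density is a Gaussian mixture, sampling frequencies from that mixture yields a feature map whose inner product approximates the exact kernel. All parameter values here are illustrative assumptions, not values from the paper, and this is a plain Monte Carlo sketch rather than the paper's differentiable reparameterized variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D spectral mixture kernel (parameters are assumed):
#   k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 sigma_q^2) * cos(2 pi mu_q tau)
weights = np.array([0.7, 0.3])   # mixture weights w_q (sum to 1 here)
means = np.array([0.5, 2.0])     # spectral means mu_q
stds = np.array([0.3, 0.2])      # spectral standard deviations sigma_q

def sm_kernel(tau):
    """Exact spectral mixture kernel value at lag tau."""
    return float(np.sum(
        weights * np.exp(-2 * np.pi**2 * tau**2 * stds**2)
        * np.cos(2 * np.pi * means * tau)
    ))

# Random Fourier features: draw frequencies from the kernel's spectral
# density (a Gaussian mixture), one component per feature.
D = 20000                                            # number of features
comp = rng.choice(len(weights), size=D, p=weights)   # mixture assignments
omega = rng.normal(means[comp], stds[comp])          # sampled frequencies
b = rng.uniform(0, 2 * np.pi, size=D)                # random phases

def phi(x):
    """Feature map with phi(x) @ phi(y) ~= sm_kernel(x - y)."""
    return np.sqrt(2.0 / D) * np.cos(2 * np.pi * omega * x + b)

x, y = 0.3, 0.8
exact = sm_kernel(x - y)
approx = phi(x) @ phi(y)
print(exact, approx)  # the approximation converges to the exact value as D grows
```

In the differentiable variant used for learning, the frequency samples would be reparameterized (e.g. `omega = means[comp] + stds[comp] * eps` with fixed standard-normal `eps`) so gradients flow to the kernel hyperparameters.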