Training Neural Networks with Uncertain Data: A Flexible Mixture of Experts Approach
This paper introduces the "Uncertainty-aware Mixture of Experts" (uMoE) method, a novel approach that embeds aleatoric uncertainty directly into the training phase of neural networks, enabling them to learn effectively from input data subject to noise or measurement error.
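To make the core idea concrete, the following is a minimal sketch of how a mixture-of-experts model might handle aleatoric input uncertainty: the input is treated as a distribution rather than a point, and the gate-weighted expert output is averaged over Monte Carlo samples. All names (`moe_forward`, `w_gate`, `W_experts`), the linear experts, the Gaussian input model, and the sampling scheme are illustrative assumptions, not the paper's actual uMoE architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy linear experts and a linear gating network (hypothetical
# stand-ins; the paper's uMoE components are not specified here).
W_experts = rng.normal(size=(3, 4))  # 3 experts, 4 input features
w_gate = rng.normal(size=(3, 4))     # gating scores per expert

def moe_forward(x):
    gate = softmax(w_gate @ x)   # soft assignment of x to experts
    outputs = W_experts @ x      # each expert's scalar prediction
    return gate @ outputs        # gate-weighted combination

# Aleatoric input uncertainty: model x as a Gaussian and average
# the MoE output over samples instead of using a single point.
mu = rng.normal(size=4)
sigma = 0.1 * np.ones(4)
samples = rng.normal(mu, sigma, size=(256, 4))
y_uncertain = np.mean([moe_forward(s) for s in samples])
```

Averaging over samples propagates the input distribution through the nonlinear gating, which a single point estimate of `x` would ignore.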