Enhancing Calibration and Out-of-Distribution Detection in Bayesian Neural Networks via Regularization, Confidence Minimization, and Selective Inference
This paper proposes an extension of variational-inference-based Bayesian learning that integrates three components: calibration regularization to improve in-distribution calibration, confidence minimization to strengthen out-of-distribution detection, and selective calibration to ensure that calibration regularization and confidence minimization are applied synergistically.
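The abstract does not specify the training objective, but a minimal sketch of how such a combined objective might look can clarify the idea. The sketch below is purely illustrative and assumes a cross-entropy data term, a KL term from the variational posterior (passed in as a precomputed scalar), an ECE-style calibration surrogate on in-distribution data, and a confidence-minimization term that pushes predictions on out-of-distribution inputs toward the uniform distribution; all function names and weights (`lam_cal`, `lam_ocm`) are hypothetical, not the paper's.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def combined_loss(logits_in, labels_in, logits_ood, kl_term,
                  lam_cal=0.1, lam_ocm=0.5):
    """Illustrative combined objective (not the paper's exact formulation):
    NLL + variational KL + calibration regularizer + OOD confidence minimization."""
    p_in = softmax(logits_in)
    n = p_in.shape[0]
    nll = -np.mean(np.log(p_in[np.arange(n), labels_in] + 1e-12))
    # Calibration regularizer: penalize the gap between mean confidence
    # and mean accuracy (a soft, ECE-style surrogate).
    conf = p_in.max(axis=1)
    acc = (p_in.argmax(axis=1) == labels_in).astype(float)
    cal_reg = abs(conf.mean() - acc.mean())
    # Confidence minimization: KL(uniform || p) on OOD inputs, which is
    # zero exactly when the OOD predictive distribution is uniform.
    p_ood = softmax(logits_ood)
    k = p_ood.shape[1]
    ocm = np.mean(np.sum((1.0 / k) * (np.log(1.0 / k) - np.log(p_ood + 1e-12)),
                         axis=1))
    return nll + kl_term + lam_cal * cal_reg + lam_ocm * ocm
```

In this sketch, confident predictions on out-of-distribution inputs raise the loss, while uniform predictions on them contribute nothing, which is the qualitative behavior confidence minimization is meant to induce.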