The article introduces the MLOD framework, which improves out-of-distribution (OOD) detection by combining multiple hypothesis testing with layer-wise feature fusion. It addresses the challenge of detecting diverse test inputs that differ significantly from the training data. MLOD requires no structural modifications or fine-tuning of the pre-trained classifier. By drawing on feature extractors at varying depths, it improves OOD detection performance over baseline methods. Experimental results show that the MLOD-Fisher variant substantially reduces the average false positive rate when KNN is used as the base scoring method on CIFAR-10.
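A minimal sketch of the Fisher-style combination idea behind MLOD-Fisher, assuming per-layer KNN distance scores and an empirical null calibrated on held-out in-distribution data; the function and variable names are illustrative, not the authors' implementation.

```python
# Sketch: fuse per-layer OOD scores via Fisher's method (assumed setup).
import numpy as np
from scipy.stats import chi2

def empirical_p_value(score, id_calibration_scores):
    """Right-tailed empirical p-value of `score` against in-distribution
    calibration scores (convention: larger score = more OOD-like)."""
    n = len(id_calibration_scores)
    return (np.sum(id_calibration_scores >= score) + 1) / (n + 1)

def fisher_combine(p_values):
    """Fisher's method: T = -2 * sum(log p_l) ~ chi-square with 2L dof
    under the null; returns the combined p-value (small => likely OOD)."""
    p_values = np.asarray(p_values)
    statistic = -2.0 * np.sum(np.log(p_values))
    return chi2.sf(statistic, df=2 * len(p_values))

# Synthetic example: one per-layer KNN distance score for a test input,
# each layer with its own in-distribution calibration set (hypothetical data).
rng = np.random.default_rng(0)
num_layers = 4
id_calibration = [rng.normal(1.0, 0.2, size=1000) for _ in range(num_layers)]
test_layer_scores = [1.1, 1.4, 2.0, 1.8]

p_per_layer = [
    empirical_p_value(s, cal)
    for s, cal in zip(test_layer_scores, id_calibration)
]
combined_p = fisher_combine(p_per_layer)
print(f"per-layer p-values: {np.round(p_per_layer, 3)}, combined: {combined_p:.4f}")
# An input is flagged as OOD when the combined p-value falls below a chosen level.
```

The design choice illustrated here is that each layer contributes its own test, so evidence of distribution shift at any depth can drive the combined decision, rather than relying on the final-layer features alone.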