Core Concepts
Multitesting-based Layer-wise Out-of-Distribution (OOD) Detection (MLOD) enhances OOD detection performance without modifying pre-trained classifiers.
Abstract
The article introduces the MLOD framework for improving out-of-distribution (OOD) detection by leveraging multiple hypothesis testing and layer-wise feature fusion. It addresses the challenge of detecting diverse test inputs that differ significantly from the training data. MLOD requires no structural modifications or fine-tuning of pre-trained classifiers. By utilizing feature extractors at varying depths, MLOD improves OOD detection performance over baseline methods. Experimental results show that MLOD-Fisher substantially reduces the average false positive rate when the detector is trained with KNN on CIFAR10.
Stats
When trained with KNN on CIFAR10, MLOD-Fisher lowers the average false positive rate (FPR) from 24.09% to 7.47% compared to using only the features of the last layer.
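The fusion step behind MLOD-Fisher rests on Fisher's method for combining the p-values obtained from OOD tests at different layer depths. The sketch below is illustrative only and not the paper's implementation: the per-layer p-values are invented, and the `fisher_combine` helper is a hypothetical name. It combines k p-values via the statistic T = -2 Σ ln p_i, which follows a chi-square distribution with 2k degrees of freedom under the null (in-distribution) hypothesis; the closed-form survival function for even degrees of freedom keeps the example stdlib-only.

```python
import math

def fisher_combine(p_values):
    """Combine per-layer p-values with Fisher's method.

    Under the null hypothesis (input is in-distribution), the statistic
    T = -2 * sum(ln p_i) follows a chi-square distribution with 2k
    degrees of freedom, where k = len(p_values).
    Returns the combined p-value (chi-square survival function at T).
    """
    k = len(p_values)
    t = -2.0 * sum(math.log(p) for p in p_values)  # assumes 0 < p <= 1
    # Chi-square survival function for even dof 2k has a closed form:
    # sf(t) = exp(-t/2) * sum_{i=0}^{k-1} (t/2)^i / i!
    half = t / 2.0
    return math.exp(-half) * sum(half**i / math.factorial(i) for i in range(k))

# Hypothetical p-values from KNN-based OOD tests at four layer depths.
layer_p = [0.30, 0.02, 0.10, 0.05]
combined = fisher_combine(layer_p)
# A small combined p-value flags the input as OOD even though no single
# layer is individually conclusive.
```

One layer with a very small p-value can dominate the statistic, which is why Fisher-style fusion can catch OOD inputs that only stand out at an intermediate depth rather than at the last layer.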