Likelihood-based deep generative models are widely used to approximate high-dimensional data distributions and to detect out-of-distribution (OOD) inputs. The paper approximates the Fisher information metric in order to measure the size of a data point's gradients, and analyzes layer-wise gradient norms, showing that they are effective signals for OOD detection. The resulting model-agnostic method based on layer-wise gradient norms outperforms the Typicality test for most deep generative models.
Neural networks can be confidently wrong on inputs that differ from the training data distribution, and deep generative models in particular struggle to detect OOD data because they often assign such data higher log-likelihoods than in-distribution data. The study therefore proposes an OOD detection method based on the gradient of a data point's log-likelihood with respect to the model parameters, formalizing it as an approximation to the Fisher information metric.
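As a toy illustration of the gradient-based idea, consider a one-dimensional Gaussian model: the per-parameter gradients of the log-likelihood grow as a point moves away from the bulk of the training distribution, so their norms can serve as an OOD score. This is only a hedged sketch of the intuition; the paper works with deep generative models and per-layer gradient norms, whereas the Gaussian model and the treatment of each parameter as its own "layer" here are illustrative assumptions.

```python
def loglik_grads(x, mu=0.0, sigma=1.0):
    """Analytic gradients of log N(x; mu, sigma^2) w.r.t. (mu, sigma).

    A stand-in for backpropagating through a deep generative model;
    each parameter plays the role of one 'layer'.
    """
    d_mu = (x - mu) / sigma**2
    d_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3
    return {"mu": d_mu, "sigma": d_sigma}

def layerwise_grad_norms(x):
    # One norm per parameter group ('layer'); a detector would
    # aggregate these into a single OOD score.
    return {name: abs(g) for name, g in loglik_grads(x).items()}

in_dist = layerwise_grad_norms(0.5)  # near the mode of the model
ood = layerwise_grad_norms(6.0)      # far from the training density
```

Here the point far from the training density yields strictly larger gradient norms for every parameter group, which is the behavior the layer-wise detector exploits.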
Key insights
by Sam Dauncey,... at arxiv.org 03-05-2024
https://arxiv.org/pdf/2403.01485.pdf