Out-of-distribution (OOD) detection is vital to safety-critical machine learning applications and has thus been extensively studied, with a plethora of methods developed in the literature. However, the field currently lacks a unified, strictly formulated, and comprehensive benchmark, which often leads to unfair comparisons and inconclusive results. From the problem-setting perspective, OOD detection is closely related to neighboring fields including anomaly detection (AD), open set recognition (OSR), and model uncertainty, since methods developed for one are often applicable to the others. To help the community improve evaluation and advance the field, we build a unified, well-structured codebase called OpenOOD, which implements over 30 methods developed in these relevant fields and provides a comprehensive benchmark under the recently proposed generalized OOD detection framework. A comprehensive comparison of these methods shows, to our gratification, that the field has progressed significantly over the past few years, with both preprocessing methods and the orthogonal post-hoc methods exhibiting strong potential.
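To make the "post-hoc" family of methods concrete, below is a minimal sketch of one classic post-hoc OOD score, the maximum softmax probability (MSP): it operates only on a trained classifier's logits, with no retraining. This is an illustrative example assuming NumPy, not OpenOOD's own implementation; the function names and toy logits are hypothetical.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    """Maximum softmax probability per sample.

    Higher scores indicate the model is more confident, which post-hoc
    detectors treat as evidence that the input is in-distribution.
    """
    return softmax(np.asarray(logits, dtype=float)).max(axis=-1)

# Toy logits (hypothetical): a confident in-distribution prediction
# versus a near-uniform, likely-OOD prediction.
id_logits = np.array([[9.0, 1.0, 0.5]])
ood_logits = np.array([[2.1, 2.0, 1.9]])
print(msp_score(id_logits) > msp_score(ood_logits))  # confident sample scores higher
```

A threshold on this score then separates in-distribution from OOD inputs; benchmarks such as OpenOOD typically report threshold-free metrics (e.g., AUROC) over the score distribution instead of fixing a single cutoff.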
Authors: Wayne Zhang, Yixuan Li, Jingkang Yang, Haoqi Wang, Zitang Zhou, Yiyou Sun, Wen-Hsiao Peng, Pengyun Wang, Dejian Zou, Kun Ding, Guangyao Chen, Bo Li, Xuefeng Du