A model that can dynamically learn new classes while detecting Out-of-Distribution (OOD) samples is desirable for most applications operating in the wild. Although work in this direction is limited, several approaches combine Incremental Learning (IL) and OOD detection and report promising results on both tasks. Most of them rely on a buffer of stored samples, used either to replay past data during training or to detect outliers at test time. This raises several issues: the buffer does not scale with a growing number of samples, storing raw samples may violate privacy constraints, outlier detection is limited to the distribution captured in the buffer, and maintaining the buffer is computationally and memory expensive. In this work, we address these issues with a simple yet effective framework, BUILD, which performs both IL and OOD detection in a buffer-free manner and is designed to operate in the wild. BUILD combines a pre-trained vision transformer fine-tuned with hard attention masks with post-hoc OOD detectors applied at test time. We show that BUILD, when paired with an activation-based post-hoc OOD technique, is not merely competitive with but outperforms state-of-the-art baselines. To support these claims, we evaluate the framework on the CIFAR-10 classification benchmark; the results show that BUILD detects OOD samples with superior and more stable performance at a much lower computational cost.
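To make the two mechanisms named in the abstract concrete, the sketches below are minimal, assumption-laden interpretations, not the paper's actual implementation. The first illustrates hard attention masking in the spirit of HAT (Serra et al., 2018): a learnable per-task embedding is passed through a scaled sigmoid to gate the units of a layer, so that units claimed by earlier tasks can be protected. The layer type, the task-embedding layout, and the scale `s` are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class HATMaskedLinear(nn.Module):
    """Linear layer gated by a per-task hard attention mask (HAT-style sketch).

    A learnable embedding per task is squashed through a scaled sigmoid;
    a large scale `s` pushes the gate toward a near-binary {0, 1} mask.
    All names and shapes here are illustrative assumptions.
    """

    def __init__(self, in_dim: int, out_dim: int, n_tasks: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.task_embed = nn.Embedding(n_tasks, out_dim)

    def forward(self, x: torch.Tensor, task_id: torch.Tensor, s: float = 400.0):
        # Scaled sigmoid yields a (near-)hard gate per output unit.
        mask = torch.sigmoid(s * self.task_embed(task_id))
        return self.linear(x) * mask
```

The abstract does not name the specific activation-based detector, so the second sketch shows one common instance of that family: rectified activations (ReAct) combined with an energy score, where penultimate activations are clipped at a threshold before taking the logsumexp over the logits. The names `backbone`, `head`, and the clipping value `c` are assumptions for illustration only.

```python
@torch.no_grad()
def activation_ood_score(x, backbone, head, c: float = 1.0):
    """Higher score = more in-distribution (energy score on clipped features).

    backbone: maps inputs to penultimate features of shape (B, D)  [assumed]
    head:     linear classifier producing logits of shape (B, K)   [assumed]
    c:        activation clipping threshold (ReAct-style rectification)
    """
    feats = torch.clamp(backbone(x), max=c)   # suppress extreme activations
    logits = head(feats)
    return torch.logsumexp(logits, dim=-1)    # energy-based confidence
```

In practice, `c` is typically set from in-distribution statistics (e.g., a high percentile of activations on a held-out set), and a sample is flagged as OOD when its score falls below a threshold chosen on validation data.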
International Conference on Machine Learning (ICML)
2024-07-03
2024-10-08