These leaderboards are used to track progress in disjoint-10-1-10
This work proposes the Learning without Forgetting method, which uses only new-task data to train the network while preserving its original capabilities, and performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques.
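The idea summarized above — train on new-task data while keeping the new model's outputs on old tasks close to the old model's — is typically realized with a knowledge-distillation term. A minimal sketch in PyTorch; the temperature value and the soft cross-entropy form are common-recipe assumptions, not details taken from this page:

```python
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, old_logits, T=2.0):
    """Soft cross-entropy between the old model's (frozen) outputs and the
    new model's outputs, both softened by temperature T. Minimizing this
    keeps the new network's old-task responses close to the recorded ones."""
    old_probs = F.softmax(old_logits / T, dim=1)          # targets (no grad needed)
    new_log_probs = F.log_softmax(new_logits / T, dim=1)  # current predictions
    # Scale by T^2 so gradients keep a comparable magnitude across temperatures.
    return -(old_probs * new_log_probs).sum(dim=1).mean() * T * T
```

In a full training loop this term would be added to the ordinary cross-entropy on the new-task labels, weighted by a trade-off coefficient.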
This work formally introduces the incremental learning problem for semantic segmentation in which a pixel-wise labeling is considered and proposes various approaches working both on the output logits and on intermediate features.
Local POD is proposed, a multi-scale pooling distillation scheme that preserves long- and short-range spatial relationships at the feature level and significantly outperforms state-of-the-art methods in existing CSS scenarios, as well as in newly proposed, more challenging benchmarks.
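A sketch of what such a multi-scale pooling distillation can look like: feature maps from the old and new models are pooled along height and width within regions of several grid sizes, and the pooled embeddings are compared. The region grid, the H/W mean-pooling, and the L2 comparison below are assumptions illustrating the general technique, not the paper's exact formulation:

```python
import torch

def pooled_embedding(fmap):
    """Concatenate width-pooled and height-pooled slices of a (B, C, H, W) map."""
    return torch.cat([fmap.mean(dim=3), fmap.mean(dim=2)], dim=-1)

def multiscale_pooling_distillation(feat_new, feat_old, scales=(1, 2, 4)):
    """Compare old/new pooled embeddings over region grids of several sizes:
    coarse scales capture long-range layout, fine scales short-range detail."""
    loss = feat_new.new_zeros(())
    _, _, h, w = feat_new.shape
    for s in scales:                      # s x s grid of regions per scale
        hs, ws = h // s, w // s
        for i in range(s):
            for j in range(s):
                rn = feat_new[:, :, i * hs:(i + 1) * hs, j * ws:(j + 1) * ws]
                ro = feat_old[:, :, i * hs:(i + 1) * hs, j * ws:(j + 1) * ws]
                loss = loss + torch.norm(
                    pooled_embedding(rn) - pooled_embedding(ro), dim=-1
                ).mean()
    return loss / sum(s * s for s in scales)
```

The loss is zero when the two feature maps agree and grows as their region-wise pooled statistics drift apart, which is what discourages forgetting at the feature level.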
This work revisits classical incremental learning methods, proposes a new distillation-based framework that explicitly accounts for a semantic distribution shift, and introduces a novel strategy to initialize the classifier's parameters, thus preventing biased predictions toward the background class.
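One plausible form of such a background-aware classifier initialization: new class heads copy the background classifier, and biases are shifted so the probability mass the old model assigned to background is split evenly between background and the new classes. The exact shift rule below (subtracting log of the new group size) is an illustrative assumption:

```python
import math
import torch

def init_new_classifiers(weight, bias, bg_idx, num_new):
    """Extend a linear classifier (weight: (C, D), bias: (C,)) with num_new
    classes initialized from the background head at index bg_idx, shifting
    biases so the old background probability is shared evenly."""
    shift = math.log(num_new + 1)
    new_w = weight[bg_idx:bg_idx + 1].repeat(num_new, 1)  # copy bg weights
    new_b = (bias[bg_idx] - shift).repeat(num_new)        # shifted bg bias
    bias = bias.clone()
    bias[bg_idx] = bias[bg_idx] - shift  # background keeps an equal share
    return torch.cat([weight, new_w], dim=0), torch.cat([bias, new_b], dim=0)
```

Because every new head starts identical to the shifted background head, at initialization no new class is predicted more strongly than background, which counters the bias toward the background class mentioned above.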
This work proposes a structural re-parameterization mechanism, named the representation compensation (RC) module, to decouple the representation learning of old and new knowledge, and achieves state-of-the-art performance.
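Structural re-parameterization in general means training parallel branches and folding them into a single equivalent operator afterwards. A minimal sketch of the idea with two parallel convolutions; the two-branch layout and the merge-by-summation rule are illustrative assumptions, not the paper's exact module:

```python
import torch
import torch.nn as nn

class TwoBranchConv(nn.Module):
    """Train two parallel 3x3 convs whose outputs are summed; by linearity
    they can later be folded into one conv with summed parameters."""
    def __init__(self, channels):
        super().__init__()
        self.main = nn.Conv2d(channels, channels, 3, padding=1)
        self.aux = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x):
        return self.main(x) + self.aux(x)

    def merge(self):
        """Re-parameterize: return a single conv equivalent to both branches."""
        fused = nn.Conv2d(self.main.in_channels, self.main.out_channels,
                          3, padding=1)
        with torch.no_grad():
            fused.weight.copy_(self.main.weight + self.aux.weight)
            fused.bias.copy_(self.main.bias + self.aux.bias)
        return fused
```

The merged conv reproduces the two-branch output exactly (up to floating-point rounding), so the extra branch costs nothing at inference time.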
This paper proposes a new method, dubbed SSUL-M (Semantic Segmentation with Unknown Label with Memory), that carefully combines techniques tailored for semantic segmentation and, for the first time in class-incremental semantic segmentation (CISS), utilizes a tiny exemplar memory to improve both plasticity and stability.
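A tiny exemplar memory of the kind mentioned above can be sketched as a fixed-capacity, roughly class-balanced buffer of past samples that get replayed during later steps. The class-balanced eviction policy below is an illustrative assumption, not the paper's exact sampling rule:

```python
import random

class ExemplarMemory:
    """Fixed-capacity buffer of past samples, trimmed so that no class
    hoards the budget: when over capacity, evict from the largest class."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = {}  # class id -> list of stored samples

    def add(self, sample, class_id):
        self.buffer.setdefault(class_id, []).append(sample)
        # Trim the largest class first until we fit the budget again.
        while sum(len(v) for v in self.buffer.values()) > self.capacity:
            biggest = max(self.buffer, key=lambda c: len(self.buffer[c]))
            self.buffer[biggest].pop(random.randrange(len(self.buffer[biggest])))

    def sample(self, k):
        """Draw up to k stored exemplars for replay alongside new-task data."""
        pool = [s for v in self.buffer.values() for s in v]
        return random.sample(pool, min(k, len(pool)))
```

Replaying these exemplars alongside new-task data is what lets such methods trade a small storage cost for better stability on old classes.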