SID: Incremental learning for anchor-free object detection via Selective and Inter-related Distillation


Abstract

Incremental learning requires a model to continually learn new tasks from streaming data. However, traditionally fine-tuning a well-trained deep neural network on a new task dramatically degrades performance on the old task, a problem known as catastrophic forgetting. In this paper, we address this issue in the context of anchor-free object detection, a new trend in computer vision that is simple, fast, and flexible. Simply adapting current incremental learning strategies fails on these anchor-free detectors because those strategies do not account for their specific model structures. To deal with the challenges of incremental learning on anchor-free object detectors, we propose a novel incremental learning paradigm called Selective and Inter-related Distillation (SID). In addition, a novel evaluation metric is proposed to better assess the performance of detectors under incremental learning conditions. By selectively distilling at the proper locations and further transferring instance-relation knowledge, our method demonstrates significant advantages on the benchmark datasets PASCAL VOC and COCO.
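The abstract describes two distillation components: a selective term that matches teacher and student features only at chosen locations, and an inter-related term that transfers relations between instances. A minimal NumPy sketch of what such losses could look like follows; the function names, the L2/similarity-matrix formulations, and the masking scheme are all illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def selective_distill_loss(teacher_feat, student_feat, mask):
    """Feature distillation restricted to selected spatial locations.

    teacher_feat, student_feat: arrays of shape (C, H, W).
    mask: array of shape (H, W) with 1 at locations chosen for distillation.
    Returns the squared error summed over channels, averaged over
    the selected locations (an assumed form of "selective" distillation).
    """
    diff = (teacher_feat - student_feat) ** 2
    return float((diff * mask).sum() / max(mask.sum(), 1.0))

def relation_distill_loss(teacher_emb, student_emb):
    """Inter-related (instance-relation) distillation.

    teacher_emb, student_emb: arrays of shape (N, D), one embedding per
    detected instance. Instead of matching embeddings directly, match the
    pairwise cosine-similarity matrices, so relational structure among
    instances is transferred from teacher to student.
    """
    def sim_matrix(e):
        e = e / np.linalg.norm(e, axis=1, keepdims=True)
        return e @ e.T
    return float(np.mean((sim_matrix(teacher_emb) - sim_matrix(student_emb)) ** 2))
```

In this sketch, a zero loss means the student already reproduces the teacher's features (or instance relations) at the distilled locations; during incremental training both terms would be added, with weighting coefficients, to the standard detection loss on the new task.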


Review timeline: Received 20 December 2020, Revised 21 May 2021, Accepted 24 May 2021, Available online 29 May 2021, Version of Record 2 June 2021.

DOI: https://doi.org/10.1016/j.cviu.2021.103229