Online Subclass Knowledge Distillation

Authors:

Highlights:

• A novel distillation method that aims to reveal subclass similarities is proposed.

• The OSKD method derives the soft labels from the model itself, in an online manner (see the sketch after this list).

• The OSKD method is model-agnostic.

• The experiments validate the effectiveness of the OSKD method.
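Since the highlights describe the method only at a high level, the following is a minimal PyTorch sketch of one possible reading of "deriving soft labels from the model itself in an online manner": soft targets built from same-class neighbours in the current mini-batch. The function name oskd_style_loss, the cosine-similarity weighting, and the temperature value are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch only; not the paper's exact OSKD formulation.
import torch
import torch.nn.functional as F

def oskd_style_loss(features, logits, labels, temperature=4.0):
    """Online self-distillation: soft targets come from the model's own batch.

    Each sample's soft target is a similarity-weighted average of the softened
    predictions of its same-class neighbours in the current mini-batch, which
    can expose subclass structure without a separate teacher network.
    """
    with torch.no_grad():
        f = F.normalize(features, dim=1)               # unit-norm feature vectors
        sim = f @ f.t()                                # pairwise cosine similarity
        same_class = labels.unsqueeze(0) == labels.unsqueeze(1)
        eye = torch.eye(labels.size(0), dtype=torch.bool, device=labels.device)
        mask = same_class & ~eye                       # same-class pairs, self excluded
        weights = torch.where(mask, sim, torch.full_like(sim, float("-inf")))
        weights = F.softmax(weights, dim=1)            # rows without any peer become NaN ...
        probs = F.softmax(logits / temperature, dim=1)
        soft_targets = weights @ probs
        has_peer = mask.any(dim=1, keepdim=True)
        soft_targets = torch.where(has_peer, soft_targets, probs)  # ... so fall back to the sample's own prediction
    log_probs = F.log_softmax(logits / temperature, dim=1)
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2
```

In training, such a term would typically be added to the standard cross-entropy loss with a weighting factor, e.g. loss = F.cross_entropy(logits, labels) + lam * oskd_style_loss(features, logits, labels).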


Keywords: Knowledge distillation, Online distillation, Subclass knowledge distillation, Self distillation, Deep neural networks

Article history: Received 21 April 2020, Revised 7 March 2021, Accepted 27 April 2021, Available online 4 May 2021, Version of Record 15 May 2021.

DOI: https://doi.org/10.1016/j.eswa.2021.115132