Informative knowledge distillation for image anomaly segmentation

Abstract

Unsupervised anomaly segmentation methods based on knowledge distillation have recently been developed and have shown superior segmentation performance. However, little attention has been paid to the overfitting problem caused by the mismatch between the capacity of the network and the amount of knowledge to be distilled in this scheme. This study proposes a novel method called informative knowledge distillation (IKD) that addresses the overfitting problem by distilling informative knowledge and offering a strong supervisory signal. Technically, a novel context similarity loss is proposed to capture context information from normal data manifolds. In addition, a novel adaptive hard sample mining method is proposed to focus training on hard samples that carry valuable information. With IKD, informative knowledge can be distilled so that the overfitting problem is effectively mitigated and performance is further improved. The proposed method outperformed state-of-the-art methods in terms of AUROC on several categories of the well-known MVTec AD dataset, achieving an overall score of 97.81% across 15 categories. Extensive ablation experiments have also been conducted to demonstrate the effectiveness of IKD in alleviating the overfitting problem.
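To make the scheme concrete, below is a minimal PyTorch sketch of the general student-teacher distillation pipeline the abstract refers to, not the authors' implementation. A frozen pretrained teacher supervises a student trained on normal images only; a per-location cosine discrepancy stands in for the context similarity loss (the paper's loss captures context information rather than purely point-wise similarity), and a fixed top-k up-weighting stands in for adaptive hard sample mining. The names FeatureNet, distillation_loss, anomaly_map, and top_frac, as well as the backbone and cut-off layer, are illustrative assumptions, and a recent torchvision (>= 0.13) is assumed for the weights API.

# A minimal sketch (not the authors' released code) of
# distillation-based anomaly segmentation.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

class FeatureNet(nn.Module):
    """Truncated ResNet-18 returning an intermediate feature map.
    The choice of backbone and cut-off layer is an assumption."""
    def __init__(self, pretrained: bool):
        super().__init__()
        weights = models.ResNet18_Weights.DEFAULT if pretrained else None
        net = models.resnet18(weights=weights)
        self.stem = nn.Sequential(net.conv1, net.bn1, net.relu, net.maxpool,
                                  net.layer1, net.layer2)

    def forward(self, x):
        return self.stem(x)  # (B, C, H, W) feature map

teacher = FeatureNet(pretrained=True).eval()    # frozen, pretrained teacher
student = FeatureNet(pretrained=False).train()  # trained on normal images only
for p in teacher.parameters():
    p.requires_grad_(False)

def distillation_loss(x, top_frac=0.2):
    """Per-location cosine discrepancy between teacher and student features,
    with extra weight on the hardest locations (a crude stand-in for
    adaptive hard sample mining)."""
    with torch.no_grad():
        ft = teacher(x)
    fs = student(x)
    d = 1.0 - F.cosine_similarity(fs, ft, dim=1)        # (B, H, W)
    k = max(1, int(top_frac * d.numel()))
    thresh = d.detach().flatten().topk(k).values.min()  # hardness threshold
    w = torch.where(d >= thresh,                        # up-weight hard pixels
                    torch.full_like(d, 2.0), torch.ones_like(d))
    return (w * d).mean()

def anomaly_map(x, out_size):
    """At test time the teacher-student discrepancy is the pixel-wise
    anomaly score: regions the student cannot imitate score high."""
    with torch.no_grad():
        d = 1.0 - F.cosine_similarity(student(x), teacher(x), dim=1)
    return F.interpolate(d.unsqueeze(1), size=out_size,
                         mode="bilinear", align_corners=False)

In use, distillation_loss(batch) would be minimized over normal training images, and anomaly_map(img, img.shape[-2:]) would yield the pixel-wise segmentation score at test time.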

Keywords: Image anomaly segmentation, Knowledge distillation, Context similarity, Hard sample mining

Article history: Received 21 November 2021, Revised 15 April 2022, Accepted 15 April 2022, Available online 26 April 2022, Version of Record 10 May 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.108846