Explainable scale distillation for hyperspectral image classification

Authors:

Highlights:

• This study presents a scale distillation network, inspired by the knowledge distillation scheme, for acquiring multi-scale information about land covers. The multi-scale information is distilled into a single-scale student model to reduce the time cost and improve the classification performance. To the best of our knowledge, this is the first time knowledge distillation has been incorporated into lightweight multi-scale hyperspectral image classification (a minimal illustrative sketch of this distillation step is given after the highlights list).

• An explainable scale network is proposed to provide a more precise explanation of the predictions. The relationship between the scale features and the land-cover categories is analyzed, and the potential applications of the explainable scale network are briefly discussed.

• In the experiments, three groups of images are used for HSI classification. The experimental results show the superiority of the proposed scale distillation network over several state-of-the-art methods; furthermore, a visual analysis experiment is provided to explain the trained scale distillation network.
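The following is a minimal, hypothetical sketch of the distillation idea described in the first highlight: a multi-scale "teacher" whose averaged predictions supervise a single-scale "student" through a standard soft-target knowledge-distillation loss. It is not the authors' code; the network definitions, patch scales, temperature, and loss weighting are illustrative assumptions only.

```python
# Illustrative sketch (NOT the paper's implementation): distilling a
# multi-scale teacher into a single-scale student for HSI patch classification.
import torch
import torch.nn as nn
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style soft-target KL term plus hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard


class SpectralCNN(nn.Module):
    """Tiny CNN over an HSI patch; the patch size sets the spatial 'scale'."""
    def __init__(self, bands, n_classes):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(bands, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):                  # x: (B, bands, s, s)
        return self.head(self.features(x).flatten(1))


# Teacher: one branch per patch scale, logits averaged across scales.
# Student: a single-scale model trained to match the teacher's soft targets.
bands, n_classes = 103, 9                  # illustrative HSI dimensions
scales = [7, 11, 15]                       # assumed multi-scale patch sizes
teacher = nn.ModuleList(SpectralCNN(bands, n_classes) for _ in scales)
student = SpectralCNN(bands, n_classes)

x15 = torch.randn(8, bands, 15, 15)        # largest patch around each pixel
labels = torch.randint(0, n_classes, (8,))

with torch.no_grad():                      # teacher is frozen during distillation
    t_logits = torch.stack([
        branch(F.interpolate(x15, size=(s, s)))
        for branch, s in zip(teacher, scales)
    ]).mean(0)

s_logits = student(x15[:, :, 4:11, 4:11])  # student sees only one 7x7 scale
loss = kd_loss(s_logits, t_logits, labels)
loss.backward()
```

Because only the single-scale student is run at inference time, the per-pixel cost no longer grows with the number of scales, which is the lightweight property the highlight refers to.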

Keywords: Hyperspectral image classification, Knowledge distillation, Scale distillation, Explainable scale network

Article history: Received 9 March 2021; Revised 30 August 2021; Accepted 9 September 2021; Available online 10 September 2021; Version of Record 20 September 2021.

Paper URL: https://doi.org/10.1016/j.patcog.2021.108316