Adaptive multi-scale transductive information propagation for few-shot learning

Authors:

Highlights:

Abstract

Few-shot learning, which aims to learn a classifier with strong generalization capability from extremely limited labeled samples, has drawn an increasing amount of attention in many areas. One representative work in this field is the transductive propagation network (TPN), which propagates labels by capturing the local geometric distribution of the data. However, TPN exploits only the manifold structure of the feature extractor's high-layer (single-scale) semantic features, while neglecting the local geometric distribution of the low-layer semantic features. This paper presents the adaptive multi-scale transductive information propagation (AMTIP) model to address this problem. Specifically, we introduce a multi-scale feature extractor network that simultaneously learns each sample's high-layer global semantic features and low-layer local semantic features. After acquiring the multi-scale features, we propose an adaptive multi-scale fusion network that generates adaptive semantic fusion features tailored to different few-shot classification tasks. Finally, the adaptive semantic fusion features are fed into the label propagation model for few-shot image classification. Compared with the single-scale semantic features of TPN, AMTIP better preserves the local geometry between the support-set and query-set data via the adaptive multi-scale semantic fusion features. Extensive experiments demonstrate the superiority of the proposed AMTIP over state-of-the-art few-shot classification models.
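The abstract describes a three-stage pipeline (multi-scale feature extraction, adaptive fusion, TPN-style label propagation). The sketch below illustrates that pipeline only in outline; the module names (`AdaptiveMultiScaleFusion`, `label_propagation`), layer sizes, and the Gaussian-affinity / closed-form propagation step are assumptions for illustration, not the authors' exact architecture or hyperparameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveMultiScaleFusion(nn.Module):
    """Hypothetical sketch: fuse low-layer (local) and high-layer (global)
    features with task-adaptive weights. Details are assumed, not from the paper."""

    def __init__(self, low_dim, high_dim, fused_dim):
        super().__init__()
        self.proj_low = nn.Linear(low_dim, fused_dim)
        self.proj_high = nn.Linear(high_dim, fused_dim)
        # Small network that predicts per-task mixing weights for the two scales.
        self.weight_net = nn.Sequential(
            nn.Linear(2 * fused_dim, fused_dim),
            nn.ReLU(),
            nn.Linear(fused_dim, 2),
        )

    def forward(self, feat_low, feat_high):
        z_low = self.proj_low(feat_low)      # low-layer local semantic features
        z_high = self.proj_high(feat_high)   # high-layer global semantic features
        # Task-level statistics drive the adaptive fusion weights.
        task_stat = torch.cat([z_low.mean(0), z_high.mean(0)], dim=-1)
        w = torch.softmax(self.weight_net(task_stat), dim=-1)
        return w[0] * z_low + w[1] * z_high


def label_propagation(fused, y_support, n_way, alpha=0.99, sigma=1.0):
    """TPN-style transductive label propagation in closed form.

    fused:     (N, d) fused features for support + query samples
    y_support: (n_support,) integer labels for the support set
    Returns soft label scores for all N samples.
    """
    # Gaussian affinity graph over all support and query samples.
    dist = torch.cdist(fused, fused) ** 2
    W = torch.exp(-dist / (2 * sigma ** 2))
    W.fill_diagonal_(0)
    d_inv_sqrt = W.sum(1).clamp(min=1e-8).pow(-0.5)
    S = d_inv_sqrt.unsqueeze(1) * W * d_inv_sqrt.unsqueeze(0)

    # One-hot labels for the support set, zeros for the (unlabeled) query set.
    Y = torch.zeros(fused.size(0), n_way)
    Y[: y_support.size(0)] = F.one_hot(y_support, n_way).float()

    # Closed-form propagation: F* = (I - alpha * S)^(-1) Y
    I = torch.eye(fused.size(0))
    return torch.linalg.solve(I - alpha * S, Y)
```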

Keywords: Few-shot learning, Transductive learning, Multi-scale information

Article history: Received 22 December 2021; Revised 30 April 2022; Accepted 3 May 2022; Available online 11 May 2022; Version of Record 20 May 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.108979