Scale fusion light CNN for hyperspectral face recognition with knowledge distillation and attention mechanism

Authors: Jie-Yi Niu, Zhi-Hua Xie, Yi Li, Si-Jia Cheng, Jia-Wei Fan

Abstract

Hyperspectral imaging technology, which combines traditional imaging and spectroscopy to simultaneously acquire spatial and spectral information, is regarded as an intuitive medium for robust face recognition. However, the intrinsic structure of hyperspectral images is more complicated than that of ordinary gray-scale or RGB images, and how to fully exploit the discriminant and correlation features with only a limited number of hyperspectral samples for deep-learning training has not been well studied. To address these problems, this paper proposes an end-to-end multiscale fusion lightweight convolutional neural network (CNN) framework for hyperspectral face recognition, termed the Feature Fusion with channel Attention Network (FFANet). Firstly, to capture richer subtle details, we introduce Second-Order Efficient Channel Attention (SECA), a variant of Efficient Channel Attention (ECA), into the framework. Unlike ECA, SECA extracts second-order statistics from each channel, which improves the network's feature-extraction ability and is better suited to the complexity of hyperspectral data. Secondly, we fuse multiscale information to yield comprehensive and discriminative representations. Finally, the combination of Self-Supervision and Knowledge Distillation (SSKD) is exploited to train an efficient deep model that learns additional dark knowledge from the trained teacher network. Experimental results on three benchmark hyperspectral face databases (PolyU, CMU, and UWA) show that the proposed approach achieves competitive accuracy and efficiency while significantly reducing storage and computation overheads. These characteristics also make it well suited to edge/mobile devices.
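To make the SECA idea concrete, the following is a minimal NumPy sketch of second-order channel attention in the ECA style: per-channel second-order statistics (a channel covariance descriptor) replace ECA's global average pooling, a small 1-D convolution models local cross-channel interaction, and a sigmoid gate rescales the channels. The function name, the fixed averaging kernel, and all shapes are illustrative assumptions, not the paper's exact formulation (in practice the 1-D kernel is learned).

```python
import numpy as np

def seca_attention(x, k=3):
    """Sketch of second-order efficient channel attention.

    x : feature map of shape (C, H, W)
    k : size of the 1-D conv over the channel descriptor (ECA-style)
    """
    C, H, W = x.shape
    feats = x.reshape(C, -1)                            # (C, H*W)
    feats = feats - feats.mean(axis=1, keepdims=True)   # center per channel
    cov = feats @ feats.T / (H * W - 1)                 # (C, C) second-order stats
    desc = cov.mean(axis=1)                             # per-channel descriptor

    # Local cross-channel interaction: 1-D conv over channels.
    # A fixed averaging kernel stands in for the learned weights.
    pad = k // 2
    padded = np.pad(desc, pad, mode="edge")
    kernel = np.ones(k) / k                             # placeholder (learned in practice)
    conv = np.array([padded[i:i + k] @ kernel for i in range(C)])

    weights = 1.0 / (1.0 + np.exp(-conv))               # sigmoid gate in (0, 1)
    return x * weights[:, None, None]                   # rescale channels
```

Compared with first-order (mean-pooled) descriptors, the covariance-based descriptor captures correlations between spatial responses within each channel, which is the property the abstract credits for handling the richer structure of hyperspectral data.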

Keywords: Hyperspectral imaging, Face recognition, Self-supervision, Attention mechanism, Knowledge distillation


DOI: https://doi.org/10.1007/s10489-021-02721-8