Learning representative features via constrictive annular loss for image classification

Authors: Jun-Bo Liu, Ya-Ping Huang, Qi Zou, Sheng-Chun Wang

Abstract

Deep convolutional neural networks (DCNNs) have achieved significant performance on image classification tasks. Designing more powerful loss functions to train robust DCNNs for image classification has become a recent trend in the community. In this paper, we present an elegant yet effective loss function, Constrictive Annular Loss (CA-Loss), to boost the classification performance of DCNNs. CA-Loss adaptively constricts the features to a suitable scale, leading to more representative features, even on imbalanced datasets. CA-Loss can be easily combined with softmax loss to jointly supervise the DCNNs. Furthermore, CA-Loss does not require additional supervisory information, and it can be easily optimized by classical optimization algorithms (e.g., stochastic gradient descent). We conduct extensive experiments on two large-scale classification benchmarks and three artificially imbalanced datasets. CA-Loss achieves state-of-the-art accuracy on these datasets, which strongly demonstrates the effectiveness of our proposed loss function.
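The abstract describes a joint supervision scheme: softmax loss plus an auxiliary term that constricts feature magnitudes to a suitable scale. The paper's exact formulation of CA-Loss is not given here, so the following is only a minimal NumPy sketch of the general idea, assuming (hypothetically) that the annular term penalizes the deviation of feature norms from a target radius `radius`, weighted by a coefficient `lam`; the function names and parameters are illustrative, not the authors' definitions.

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    # Numerically stable softmax cross-entropy (the "softmax loss").
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def annular_penalty(features, radius):
    # Hypothetical constriction term: pulls each feature's L2 norm
    # toward a common target radius, i.e., onto an annulus in feature space.
    norms = np.linalg.norm(features, axis=1)
    return np.mean((norms - radius) ** 2)

def joint_loss(logits, labels, features, radius, lam=0.01):
    # Joint supervision: softmax loss plus the weighted annular term.
    # Both terms are differentiable, so the sum can be optimized with SGD.
    return softmax_cross_entropy(logits, labels) + lam * annular_penalty(features, radius)
```

Because the auxiliary term depends only on the features and a scalar radius, it needs no extra labels, consistent with the abstract's claim that no additional supervisory information is required.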

Keywords: Image classification, Representative features, Imbalanced dataset, Low-quality samples, Deep convolutional neural networks


Paper link: https://doi.org/10.1007/s10489-019-01434-3