Adaptive weighted learning for linear regression problems via Kullback–Leibler divergence

Authors:

Highlights:

Abstract

In this paper, we propose adaptive weighted learning for linear regression problems via the Kullback–Leibler (KL) divergence. An alternating optimization method is used to solve the proposed model, and we theoretically demonstrate that the solution produced by the optimization algorithm converges to a stationary point of the model. In addition, we fuse global linear regression and class-oriented linear regression and discuss the problem of parameter selection. Experimental results on face and handwritten numeral databases show that the proposed method is effective for image classification, particularly when the samples in the training and test sets have different characteristics.
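The abstract describes weighted linear regression in which the weights are adapted under a KL-divergence penalty and the model is solved by alternating optimization. The following is a minimal sketch of that general idea, not the paper's exact formulation: it assumes per-sample weights penalized by their KL divergence from the uniform distribution, and the function name `kl_weighted_regression` and the parameter `gamma` are illustrative choices introduced here.

```python
import numpy as np

def kl_weighted_regression(X, y, gamma=1.0, n_iter=20):
    """Sketch of adaptive weighted least squares with a KL penalty.

    Assumed model (one plausible instantiation of the abstract's idea):
        min_{beta, w}  sum_i w_i * (y_i - x_i^T beta)^2 + gamma * KL(w || uniform)
        s.t.           w_i >= 0,  sum_i w_i = 1
    solved by alternating between beta and w.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # start from uniform weights
    beta = np.zeros(d)
    for _ in range(n_iter):
        # Step 1: fix w, solve the weighted least-squares problem for beta.
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X + 1e-8 * np.eye(d), X.T @ W @ y)
        # Step 2: fix beta, update w in closed form. Minimizing
        # sum_i w_i r_i^2 + gamma * sum_i w_i log(n * w_i) over the simplex
        # gives w_i proportional to exp(-r_i^2 / gamma), so samples with
        # large residuals are automatically down-weighted.
        r2 = (y - X @ beta) ** 2
        w = np.exp(-r2 / gamma)
        w /= w.sum()
    return beta, w

# Toy usage: a few corrupted observations should receive near-zero weight.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
beta_true = rng.normal(size=5)
y = X @ beta_true + 0.05 * rng.normal(size=100)
y[:5] += 5.0                          # outliers
beta_hat, w_hat = kl_weighted_regression(X, y, gamma=0.5)
print(np.round(beta_hat - beta_true, 3))
print(w_hat[:5])
```

In this sketch the KL term keeps the weights from collapsing onto a few samples: as `gamma` grows, the weights approach uniform and the method reduces to ordinary least squares; as `gamma` shrinks, large-residual samples are suppressed more aggressively.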

Keywords: Linear regression; KL divergence; Weighted learning; Alternating optimization; Image classification

Abbreviations: KL, Kullback–Leibler; GLR, global linear regression; COLR, class-oriented linear regression; WGLR, weighted global linear regression; WCOLR, weighted class-oriented linear regression; FLR, fusion of GLR and COLR; WFLR, fusion of WGLR and WCOLR; NN, nearest neighbor; RLR, robust linear regression; MC, maximum correntropy; SRC, sparse representation classification

Article history: Received 5 April 2012, Revised 28 September 2012, Accepted 24 October 2012, Available online 2 November 2012.

DOI: https://doi.org/10.1016/j.patcog.2012.10.017