Two improved attribute weighting schemes for value difference metric
Authors: Liangxiao Jiang, Chaoqun Li
Abstract
Due to its simplicity, efficiency, and efficacy, the value difference metric (VDM) has continued to perform well against more sophisticated newcomers and thus has remained of great interest to the distance metric learning community. Among the numerous approaches to improving VDM by weakening its attribute independence assumption, attribute weighting has received relatively little attention (only two attribute weighting schemes have been proposed), yet it has demonstrated remarkable class probability estimation performance. Of the two existing attribute weighting schemes, one is non-symmetric and the other is symmetric. In this paper, we propose two simple improved schemes for setting attribute weights for use with VDM. One is the non-symmetric Kullback–Leibler divergence weighted value difference metric (KLD-VDM) and the other is the symmetric gain ratio weighted value difference metric (GR-VDM). We performed extensive evaluations on a large number of datasets and found that KLD-VDM and GR-VDM significantly outperform the two existing attribute weighting schemes in terms of negative conditional log likelihood and root relative squared error, while maintaining the computational simplicity and robustness that characterize VDM.
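For context, an attribute-weighted VDM generally takes the form $d(x, y) = \sum_{i} w_i \sum_{c} \left| P(c \mid a_i = x_i) - P(c \mid a_i = y_i) \right|^{q}$, where the weight $w_i$ reflects how predictive attribute $a_i$ is of the class. The sketch below illustrates this form with gain-ratio weights computed by the standard information-theoretic definition (information gain divided by split information). It is a minimal illustration under those assumptions, not the paper's exact GR-VDM or KLD-VDM formulation; the function names (`conditional_probs`, `gain_ratio_weights`, `weighted_vdm`), the exponent `q = 1`, the Laplace smoothing, and the weight normalization are all hypothetical choices not given in this abstract.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of an integer label array."""
    counts = np.bincount(labels)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

def conditional_probs(X, y, n_classes):
    """Per-attribute tables of P(c | A_i = v), estimated by smoothed frequency.

    X: (n, m) array of discrete attribute values; y: class labels in 0..n_classes-1.
    """
    tables = []
    for i in range(X.shape[1]):
        table = {}
        for v in np.unique(X[:, i]):
            counts = np.bincount(y[X[:, i] == v], minlength=n_classes)
            table[v] = (counts + 1) / (counts.sum() + n_classes)  # Laplace smoothing
        tables.append(table)
    return tables

def gain_ratio_weights(X, y):
    """Standard gain ratio of each attribute, normalized to sum to 1 (illustrative)."""
    n, m = X.shape
    h_c = entropy(y)
    weights = np.empty(m)
    for i in range(m):
        values, val_counts = np.unique(X[:, i], return_counts=True)
        pv = val_counts / n
        cond = sum(p * entropy(y[X[:, i] == v]) for v, p in zip(values, pv))
        split_info = -(pv * np.log2(pv)).sum()
        weights[i] = (h_c - cond) / split_info if split_info > 0 else 0.0
    total = weights.sum()
    return weights / total if total > 0 else np.full(m, 1.0 / m)

def weighted_vdm(x1, x2, tables, weights, q=1):
    """Attribute-weighted VDM between two instances (values assumed seen in training)."""
    d = 0.0
    for i, w in enumerate(weights):
        p1, p2 = tables[i][x1[i]], tables[i][x2[i]]
        d += w * (np.abs(p1 - p2) ** q).sum()
    return d

# Toy usage: attribute 0 perfectly predicts the class, attribute 1 is noise,
# so the gain-ratio weights concentrate all mass on attribute 0.
X = np.array([[0, 1], [0, 0], [1, 1], [1, 0]])
y = np.array([0, 0, 1, 1])
tables = conditional_probs(X, y, n_classes=2)
w = gain_ratio_weights(X, y)                  # -> [1.0, 0.0]
print(weighted_vdm(X[0], X[2], tables, w))    # distance across the class boundary
```

The design point this sketch makes concrete is the one the abstract emphasizes: the weighting adds only a cheap one-pass statistic per attribute on top of the frequency tables VDM already needs, so the metric keeps VDM's computational simplicity.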
Keywords: Distance metric learning, Value difference metric, Attribute weighting, Gain ratio, Kullback–Leibler divergence
Paper URL: https://doi.org/10.1007/s10115-018-1229-3