A decision cognizant Kullback–Leibler divergence

Authors:

Highlights:

• The decision cognizant Kullback–Leibler divergence is a better statistic to measure classifier (in)congruence.

• Analytic and simulation studies show the new divergence is more robust to minority class clutter.

• Sensitivity to estimation error is lower than that of the classical Kullback–Leibler divergence (see the illustrative sketch below).

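To make the contrast between the classical divergence and a decision cognizant one concrete, the following is a minimal Python sketch. It assumes one plausible construction, pooling all non-dominant ("clutter") classes into a single composite class before computing the divergence; the function names, the pooling rule, and the example posteriors are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # Classical Kullback-Leibler divergence D(p || q), summed over every class.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def decision_cognizant_kl(p, q, eps=1e-12):
    # Illustrative (assumed) decision cognizant variant: keep the classes that
    # win the decision under p and under q, pool the remaining minority-class
    # "clutter" into one composite class, then compute the divergence on the
    # reduced distributions.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    keep = sorted({int(np.argmax(p)), int(np.argmax(q))})
    rest = [i for i in range(len(p)) if i not in keep]
    p_red = np.append(p[keep], p[rest].sum())
    q_red = np.append(q[keep], q[rest].sum())
    return float(np.sum(p_red * np.log((p_red + eps) / (q_red + eps))))

# Two classifiers that agree on the dominant class but differ on several
# low-probability classes: the full KL divergence is inflated by that clutter,
# while the pooled variant stays close to zero.
p = [0.70, 0.10, 0.05, 0.05, 0.05, 0.05]
q = [0.68, 0.02, 0.10, 0.08, 0.06, 0.06]
print("classical KL:       ", round(kl_divergence(p, q), 4))
print("decision cognizant: ", round(decision_cognizant_kl(p, q), 4))
```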

Keywords: Kullback–Leibler divergence, Divergence clutter, Classifier incongruence

Article history: Received 23 May 2016, Revised 12 August 2016, Accepted 18 August 2016, Available online 20 August 2016, Version of Record 31 August 2016.

DOI: https://doi.org/10.1016/j.patcog.2016.08.018