A closed-form reduction of multi-class cost-sensitive learning to weighted multi-class learning

Abstract:

In cost-sensitive learning, misclassification costs can vary across classes. This paper investigates an approach that reduces multi-class cost-sensitive learning to a standard classification task, based on the data-space expansion technique developed by Abe et al.; for binary classification, the reduction coincides with Elkan's. Under the proposed reduction, a cost-sensitive learning problem can be solved by considering a standard 0/1-loss classification problem on a new distribution determined by the cost matrix. We also propose a new weighting mechanism for solving the reduced standard classification problem, based on a theorem stating that the empirical loss on independent and identically distributed (i.i.d.) samples from the new distribution is essentially the same as the loss on the expanded weighted training set. Experimental results on several synthetic and benchmark datasets show that the proposed weighting approach is more effective than existing representative approaches for cost-sensitive learning.
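To make the reduction concrete, below is a minimal NumPy sketch of the data-space expansion idea described in the abstract: each training example with a per-class cost vector is replicated once per candidate label, carrying a weight that reflects how much cheaper that label is than the worst one. The function name `expand_cost_sensitive` and the benefit-style weight w_y = max_j c_j - c_y are illustrative assumptions, since the abstract does not spell out the paper's exact closed form.

```python
import numpy as np

def expand_cost_sensitive(X, C):
    """Expand a cost-sensitive dataset into a weighted multi-class one.

    X : (n, d) feature matrix.
    C : (n, K) cost matrix; C[i, y] is the cost of predicting label y
        for example i.

    Returns (X_exp, y_exp, w_exp): each example replicated K times, once
    per candidate label, with weight w_y = max_j C[i, j] - C[i, y]
    (an assumed benefit-style weighting, not necessarily the paper's form).
    """
    n, K = C.shape
    X_exp = np.repeat(X, K, axis=0)           # each row repeated K times
    y_exp = np.tile(np.arange(K), n)          # candidate labels 0..K-1 per example
    w_exp = (C.max(axis=1, keepdims=True) - C).ravel()  # cheaper labels weigh more
    return X_exp, y_exp, w_exp
```

The expanded triples can then be handed to any standard learner that accepts per-example weights (e.g. a `sample_weight` argument), which is exactly the weighted multi-class learning problem named in the title.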

Keywords: Cost-sensitive learning, Supervised learning, Statistical learning theory, Classification

Article history: Received 26 October 2007, Revised 16 December 2008, Accepted 19 December 2008, Available online 6 January 2009.

Paper link: https://doi.org/10.1016/j.patcog.2008.12.011