Improving imbalanced classification using near-miss instances

Authors:

Highlights:

• Class-imbalanced classification can be improved by utilizing ‘near-miss’ instances.

• Side information, a ‘positivity’ score attached to each instance, specifies which instances are near-misses.

• Treating near-misses as being partly positive reduces the estimation variance.

• A non-asymptotic bound and extensive experiments show the superior performance.
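The highlights describe treating near-miss negatives as partly positive, using a per-instance ‘positivity’ score as side information. A minimal sketch of that idea, assuming soft-label logistic regression trained by gradient descent; the function names, the mixing weight `lam`, and the exact way the positivity score `s` enters the soft target are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

def fit_soft_logreg(X, y, s, lam=0.5, lr=0.1, n_iter=2000):
    """Logistic regression with soft targets (illustrative sketch).

    Positives keep target 1; negatives get the partial target lam * s,
    so 'near-miss' negatives (high positivity s) are treated as
    partly positive instead of hard zeros.
    """
    y_soft = np.where(y == 1, 1.0, lam * s)          # soften near-miss labels
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])    # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))            # sigmoid predictions
        w -= lr * Xb.T @ (p - y_soft) / len(y)       # cross-entropy gradient
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(int)
```

On an imbalanced dataset (e.g. 10 positives vs. 90 negatives), the soft targets give near-miss negatives partial credit during training, which is one way to realize the variance-reduction idea the highlights refer to.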

Abstract:


Keywords: Imbalanced classification, Learning using privileged information, Generalized distillation

Article history: Received 7 September 2020, Revised 30 November 2021, Accepted 29 March 2022, Available online 10 April 2022, Version of Record 26 April 2022.

DOI: https://doi.org/10.1016/j.eswa.2022.117130