Instance exploitation for learning temporary concepts from sparsely labeled drifting data streams
Authors:
Highlights:
• Enhancing drift adaptation in sparsely labeled data streams at no additional cost.
• Instance exploitation techniques to empower active learning and avoid underfitting (see the sketch after this list).
• Ensemble architectures adaptively switching between risky and standard adaptation.
• Flexible framework that can be used to enhance any online active learning algorithm.
• In-depth analysis of enhanced drift adaptation via extensive experimental study.
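The second and third highlights refer to instance exploitation under a limited active-learning budget. Below is a minimal, illustrative sketch of that general idea, not the authors' implementation: it assumes an uncertainty-based labeling strategy and scikit-learn's SGDClassifier as the online learner, and all names and parameters (budget, threshold, n_exposures, the synthetic data) are placeholder assumptions rather than values from the paper. The only point it demonstrates is that, once a label is purchased within the budget, the scarce labeled instance is reused for several consecutive updates instead of a single one.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Synthetic stand-in for a data stream (placeholder data, not a benchmark from the paper).
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
classes = np.unique(y)

model = SGDClassifier(random_state=0)
model.partial_fit(X[:1], y[:1], classes=classes)   # initial update to register the classes

budget = 0.10        # assumed fraction of the stream we may request labels for
threshold = 0.5      # assumed uncertainty margin that triggers a label query
n_exposures = 5      # instance exploitation: how many times a purchased label is reused
spent = 0

for t in range(1, len(X)):
    x_t = X[t:t + 1]                                   # current stream instance
    margin = abs(model.decision_function(x_t)[0])      # small margin = uncertain prediction
    if spent / t < budget and margin < threshold:      # query the label only within budget
        spent += 1
        for _ in range(n_exposures):                   # exploit the scarce labeled instance
            model.partial_fit(x_t, y[t:t + 1])

In this framing, the ensemble architecture from the third highlight would maintain both a standard learner (single update per labeled instance) and a risky one (repeated re-exposure), switching between them according to their current error; that switching logic is omitted from the sketch above.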
Keywords: Machine learning, Data stream mining, Concept drift, Sparse labeling, Active learning
Article history: Received 9 August 2020, Revised 1 November 2021, Accepted 25 April 2022, Available online 27 April 2022, Version of Record 3 May 2022.
DOI: https://doi.org/10.1016/j.patcog.2022.108749