Gradient and Newton boosting for classification and regression

Authors:

Highlights:

• Present gradient, Newton, and hybrid gradient-Newton boosting in a unified framework.

• Show that Newton boosting achieves significantly higher predictive accuracy.

• The reason for the higher predictive accuracy is not faster convergence.

• Introduce a novel, interpretable tuning parameter that is important for predictive accuracy.
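
For context, a minimal sketch of the two update rules contrasted in the highlights, in the notation commonly used in the boosting literature; the symbols $L$, $g_i$, $h_i$, $f_m$, and $\nu$ are shorthand introduced here and are not necessarily the paper's own notation:

```latex
% Iteration m of boosting: loss L, current ensemble F_{m-1}, base learner f_m, shrinkage \nu.

% Gradient boosting: fit f_m to the negative gradients by unweighted least squares.
g_i = \left.\frac{\partial L(y_i, F)}{\partial F}\right|_{F = F_{m-1}(x_i)}, \qquad
f_m = \arg\min_{f} \sum_{i=1}^{n} \bigl(-g_i - f(x_i)\bigr)^{2}.

% Newton boosting: use the second derivatives h_i as weights, i.e. one Newton-Raphson
% step in function space per iteration.
h_i = \left.\frac{\partial^{2} L(y_i, F)}{\partial F^{2}}\right|_{F = F_{m-1}(x_i)}, \qquad
f_m = \arg\min_{f} \sum_{i=1}^{n} h_i \Bigl(-\tfrac{g_i}{h_i} - f(x_i)\Bigr)^{2}.

% Both variants then update the ensemble with shrinkage:
F_m(x) = F_{m-1}(x) + \nu\, f_m(x).
```

Roughly, the hybrid gradient-Newton variant mentioned in the highlights combines the two: the base learner (e.g., the tree structure) is found with the gradient criterion, and its fitted values are then corrected with a Newton step; see the paper itself for the exact formulation.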


Keywords: Boosting, Supervised learning, Ensembles, Trees

Article history: Received 23 July 2019; Revised 30 September 2020; Accepted 30 September 2020; Available online 20 October 2020; Version of Record 10 February 2021.

DOI: https://doi.org/10.1016/j.eswa.2020.114080