Forest PA: Constructing a decision forest by penalizing attributes used in previous trees

Authors:

Highlights:

• Forest PA assigns weights only to the attributes appearing in the latest tree.

• Weights are obtained randomly from dynamically determined weight ranges.

• Weights are incremented if the attributes do not appear in the subsequent tree(s).

• Forest PA is applied to 20 well-known data sets.

• The experimental results indicate the effectiveness of Forest PA.
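The highlights above outline the weighting scheme only at a high level. Below is a minimal, illustrative Python sketch of that idea, not the authors' exact algorithm: here the attribute weights drive per-tree feature sampling (a simplification), and the low weight range (0.05–0.5), the 0.1 increment, and the scikit-learn `DecisionTreeClassifier` base learner are all assumptions; Forest PA itself determines weight ranges dynamically and applies the weights to the attributes' split merits.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # assumed base learner, not part of the paper


def forest_pa_sketch(X, y, n_trees=50, subspace=0.7, seed=0):
    """Illustrative sketch of the weight-penalization idea.

    Attributes used in the latest tree receive a fresh, low random weight;
    attributes that stay unused in subsequent trees have their weights
    incremented back toward 1.0. X, y are assumed to be NumPy arrays with
    integer class labels 0..K-1.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    weights = np.ones(p)                    # all attributes start unpenalized
    k = max(1, int(subspace * p))           # features sampled per tree (assumption)
    forest = []
    for _ in range(n_trees):
        # Sample a feature subset with probability proportional to the weights,
        # so penalized attributes are less likely to reappear immediately.
        probs = weights / weights.sum()
        feats = rng.choice(p, size=k, replace=False, p=probs)
        boot = rng.integers(0, n, size=n)   # bootstrap sample (bagging step)
        tree = DecisionTreeClassifier(random_state=int(rng.integers(2**31)))
        tree.fit(X[boot][:, feats], y[boot])
        forest.append((tree, feats))

        # Attributes actually used as split points in the latest tree.
        used = {int(feats[i]) for i in tree.tree_.feature if i >= 0}
        for a in range(p):
            if a in used:
                # Penalize: draw a new weight from a low range
                # (the paper determines this range dynamically).
                weights[a] = rng.uniform(0.05, 0.5)
            elif weights[a] < 1.0:
                # Increment weights of attributes absent from the latest tree.
                weights[a] = min(1.0, weights[a] + 0.1)
    return forest


def predict(forest, X):
    """Majority vote over the trees' predictions."""
    votes = np.stack([t.predict(X[:, f]) for t, f in forest])
    return np.array([np.bincount(col.astype(int)).argmax() for col in votes.T])
```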


Keywords: Classification, Decision Tree, Decision Forest, Random Forest, Ensemble Accuracy

Article history: Received 28 January 2017, Revised 31 July 2017, Accepted 1 August 2017, Available online 3 August 2017, Version of Record 8 August 2017.

DOI: https://doi.org/10.1016/j.eswa.2017.08.002