Monotonicity Maintenance in Information-Theoretic Machine Learning Algorithms

Author: Arie Ben-David

Abstract

Decision trees based on information theory are useful paradigms for learning from examples. However, in some real-world applications, known information-theoretic methods frequently generate non-monotonic decision trees, in which objects with better attribute values are sometimes classified into lower classes than objects with inferior values. This property is undesirable in many application domains, such as credit scoring and insurance premium determination, where monotonicity of subsequent classifications is important. An attribute-selection metric is proposed here that takes both error and monotonicity into account while building decision trees. The metric is empirically shown to significantly reduce the degree of non-monotonicity of decision trees without sacrificing their inductive accuracy.
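As a rough illustration of the kind of criterion the abstract describes, the following minimal Python sketch combines the usual entropy term with a non-monotonicity penalty counting class-order violations between ordered branches. This is an assumption-laden illustration, not the paper's actual metric; the helper names and the weight monotonicity_weight are hypothetical.

# Minimal sketch (not the paper's formulation): score a candidate split by its
# weighted child entropy plus a penalty for branches whose majority classes
# decrease as the ordered attribute value increases. `monotonicity_weight`
# and the penalty definition are illustrative assumptions.
from collections import Counter
from math import log2


def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())


def nonmonotonicity(split_branches):
    """Fraction of branch pairs whose majority class decreases while the
    ordered attribute value increases. `split_branches` is a list of
    (attribute_value, labels) pairs."""
    branches = sorted(split_branches, key=lambda b: b[0])
    majorities = [Counter(labels).most_common(1)[0][0] for _, labels in branches]
    pairs = violations = 0
    for i in range(len(majorities)):
        for j in range(i + 1, len(majorities)):
            pairs += 1
            if majorities[j] < majorities[i]:
                violations += 1
    return violations / pairs if pairs else 0.0


def selection_score(parent_labels, split_branches, monotonicity_weight=1.0):
    """Lower is better: weighted average child entropy plus a
    non-monotonicity penalty."""
    total = len(parent_labels)
    weighted_entropy = sum(
        len(labels) / total * entropy(labels) for _, labels in split_branches
    )
    return weighted_entropy + monotonicity_weight * nonmonotonicity(split_branches)

A tree builder would pick, at each node, the attribute whose split minimizes selection_score; setting monotonicity_weight to zero reduces the sketch to a purely entropy-based choice.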

Keywords: information theory, monotonic decision trees, consistency, accuracy, monotonic classification problems

Paper URL: https://doi.org/10.1023/A:1022655006810