An Empirical Comparison of Selection Measures for Decision-Tree Induction

1989 Machine Learning 462 citations

Keywords

Tree (set theory), Pruning, Computer science, Decision tree, Measure (data warehouse), Reliability (semiconductor), Artificial intelligence, Data mining, Machine learning, Set (abstract data type), Incremental decision tree, Mathematics, Pattern recognition (psychology), Decision tree learning

Related Publications

Bagging, boosting, and C4.5

Breiman's bagging and Freund and Schapire's boosting are recent methods for improving the predictive power of classifier learning systems. Both form a set of classifiers that a...

1996 National Conference on Artificial Int... 1262 citations

Best-first Decision Tree Learning

Decision trees are potentially powerful predictors and explicitly represent the structure of a dataset. Standard decision tree learners such as C4.5 expand nodes in depth-first ...

2007 Research Commons (University of Waikato) 229 citations

Publication Info

Year
1989
Type
article
Volume
3
Issue
4
Pages
319-342
Citations
462
Access
Closed


Citation Metrics

462 (OpenAlex)

Cite This

John Mingers (1989). An Empirical Comparison of Selection Measures for Decision-Tree Induction. Machine Learning, 3(4), 319-342. https://doi.org/10.1023/a:1022645801436

Identifiers

DOI
10.1023/a:1022645801436