Abstract

This paper introduces OC1, a new algorithm for generating multivariate decision trees. Multivariate trees classify examples by testing linear combinations of the features at each non-leaf node of the tree. Each test is equivalent to a hyperplane at an oblique orientation to the axes. Because of the computational intractability of finding an optimal orientation for these hyperplanes, heuristic methods must be used to produce good trees. This paper explores a new method that combines deterministic and randomized procedures to search for a good tree. Experiments on several different real-world data sets demonstrate that the method consistently finds much smaller trees than comparable methods using univariate tests. In addition, the accuracy of the trees found with our method matches or exceeds the best results of other machine learning methods.
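The abstract's central mechanism, a linear-combination test at each internal node found by mixing deterministic and randomized search moves, can be made concrete with a short sketch. This is a minimal illustration, not the authors' OC1 implementation: OC1 adjusts one hyperplane coefficient at a time with a deterministic one-dimensional optimization and uses randomization to escape local minima, whereas the sketch below uses simple random coordinate perturbations with restarts. All identifiers (ObliqueNode, impurity, search_hyperplane) and all parameter choices are illustrative assumptions, not names from the paper.

import random
from dataclasses import dataclass

@dataclass
class ObliqueNode:
    # One non-leaf node of a multivariate tree: it tests a linear
    # combination of the features, i.e. a hyperplane at an oblique
    # orientation to the feature axes.
    weights: list       # coefficients a_1 .. a_d
    threshold: float    # constant term

    def goes_left(self, x):
        return sum(w * xi for w, xi in zip(self.weights, x)) > self.threshold

def impurity(node, X, y):
    # Fraction misclassified when each side of the split predicts its own
    # majority class (a simple stand-in for an impurity measure).
    sides = ([], [])
    for xi, yi in zip(X, y):
        sides[0 if node.goes_left(xi) else 1].append(yi)
    errors = 0
    for side in sides:
        if side:
            majority = max(set(side), key=side.count)
            errors += sum(1 for yi in side if yi != majority)
    return errors / len(y)

def search_hyperplane(X, y, restarts=5, steps=200, seed=0):
    # Deterministic-plus-randomized search, loosely in the paper's spirit:
    # greedy acceptance of coordinate perturbations, with random restarts
    # to escape local minima of the impurity measure.
    rng = random.Random(seed)
    d = len(X[0])
    best_score, best_node = None, None
    for _ in range(restarts):
        node = ObliqueNode([rng.uniform(-1, 1) for _ in range(d)],
                           rng.uniform(-1, 1))
        score = impurity(node, X, y)
        for _ in range(steps):
            i = rng.randrange(d + 1)   # pick one coefficient, or the threshold
            if i < d:
                old, node.weights[i] = node.weights[i], node.weights[i] + rng.gauss(0, 0.5)
            else:
                old, node.threshold = node.threshold, node.threshold + rng.gauss(0, 0.5)
            new_score = impurity(node, X, y)
            if new_score <= score:     # keep improving (or equal) moves
                score = new_score
            elif i < d:                # otherwise revert the perturbation
                node.weights[i] = old
            else:
                node.threshold = old
        if best_score is None or score < best_score:
            best_score, best_node = score, node
    return best_node

# Toy usage: four points separable by the oblique line x1 + x2 = 1.
X = [[0.2, 0.1], [0.9, 0.8], [0.4, 0.9], [0.1, 0.3]]
y = [0, 1, 1, 0]
node = search_hyperplane(X, y)
print(impurity(node, X, y))   # 0.0 once a separating hyperplane is found

Because finding the optimal hyperplane orientation is computationally intractable, a search of this kind trades optimality for tractability; the random restarts guard against the local minima that a purely greedy procedure would get stuck in.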

Keywords

Hyperplane, Oblique case, Univariate, Decision tree, Computer science, Multivariate statistics, Node (physics), Tree (set theory), Orientation (vector space), Search tree, Artificial intelligence, Algorithm, Mathematics, Data mining, Machine learning, Search algorithm, Combinatorics, Engineering

Related Publications

Best-first Decision Tree Learning

Decision trees are potentially powerful predictors and explicitly represent the structure of a dataset. Standard decision tree learners such as C4.5 expand nodes in depth-first ...

2007 · Research Commons (University of Waikato) · 229 citations

Publication Info

Year
1993
Type
article
Pages
322-327
Citations
124
Access
Closed

Citation Metrics

124 (OpenAlex)

Cite This

Sreerama K. Murthy, Simon Kasif, Steven L. Salzberg et al. (1993). OC1: randomized induction of oblique decision trees. Proceedings of the Eleventh National Conference on Artificial Intelligence (AAAI-93), 322-327.