Abstract

This paper describes a new technique for solving multiclass learning problems by combining Freund and Schapire's boosting algorithm with the main ideas of Dietterich and Bakiri's method of error-correcting output codes (ECOC). Boosting is a general method of improving the accuracy of a given base or "weak" learning algorithm. ECOC is a robust method of solving multiclass learning problems by reducing them to a sequence of two-class problems. We show that our new hybrid method has advantages of both: Like ECOC, our method only requires that the base learning algorithm work on binary-labeled data. Like boosting, we prove that the method comes with strong theoretical guarantees on the training and generalization error of the final combined hypothesis, assuming only that the base learning algorithm performs slightly better than random guessing. Although previous methods were known for boosting multiclass problems, the new method may be significantly faster and require less programming effort in creating the base learning algorithm. We also compare the new algorithm experimentally to other voting methods.
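To make the ECOC-style reduction concrete, the following minimal sketch (not the paper's actual algorithm; the code matrix, toy data, and decision-stump learner are all hypothetical choices for illustration) trains one weak binary classifier per code-word bit and decodes predictions by nearest Hamming distance:

```python
# Illustrative ECOC reduction: multiclass -> several binary problems.
# Everything below (code matrix, stump learner, data) is a hypothetical
# example, not the paper's AdaBoost.OC method.

# Code matrix: one row per class, one column per binary subproblem.
# Rows should be well separated in Hamming distance (here, distance >= 3).
CODES = {
    0: (0, 0, 0, 1, 1),
    1: (0, 1, 1, 0, 0),
    2: (1, 0, 1, 0, 1),
}

def train_stump(xs, bits):
    """A very weak binary learner: choose the single threshold (and
    direction) on a 1-D feature that best matches the 0/1 bit labels."""
    best = None
    for t in xs:
        for sign in (1, -1):
            preds = [int(sign * (x - t) > 0) for x in xs]
            acc = sum(p == b for p, b in zip(preds, bits)) / len(bits)
            if best is None or acc > best[0]:
                best = (acc, t, sign)
    _, t, sign = best
    return lambda x: int(sign * (x - t) > 0)

def ecoc_train(xs, ys):
    """Train one binary stump per column of the code matrix."""
    ncols = len(next(iter(CODES.values())))
    return [train_stump(xs, [CODES[y][j] for y in ys]) for j in range(ncols)]

def ecoc_predict(stumps, x):
    """Decode to the class whose code word is nearest in Hamming distance."""
    word = [h(x) for h in stumps]
    return min(CODES, key=lambda c: sum(a != b for a, b in zip(word, CODES[c])))

# Toy 1-D data: class 0 near 0, class 1 near 5, class 2 near 10.
xs = [0.1, 0.4, 4.9, 5.2, 9.8, 10.3]
ys = [0, 0, 1, 1, 2, 2]
stumps = ecoc_train(xs, ys)
print([ecoc_predict(stumps, x) for x in xs])  # → [0, 0, 1, 1, 2, 2]
```

Because the code words are well separated, decoding by nearest code word can recover the correct class even when individual binary classifiers err on some bits; this error-correcting property is the robustness the abstract attributes to ECOC, and the paper's contribution is to combine such a reduction with boosting's reweighting and its accompanying guarantees.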

Keywords

Boosting (machine learning), Computer science, Artificial intelligence, Machine learning, Multiclass classification, Generalization error, Binary number, Base (topology), Ensemble learning, Class (philosophy), Algorithm, Mathematics, Artificial neural network, Support vector machine

Publication Info

Year
1997
Type
article
Pages
313-321
Citations
263
Access
Closed

Citation Metrics

263 (OpenAlex)

Cite This

Robert E. Schapire (1997). Using output codes to boost multiclass learning problems, 313-321.