Abstract
A boosting algorithm converts a learning machine with an error rate of less than 50% to one with an arbitrarily low error rate. However, the algorithm discussed here depends on having a large supply of independent training samples. We show how to circumvent this problem and generate an ensemble of learning machines whose performance in optical character recognition problems is dramatically improved over that of a single network. We report the effect of boosting on four handwritten-character databases: 12,000 digits from segmented ZIP codes from the United States Postal Service (USPS) and, from the National Institute of Standards and Technology (NIST), 220,000 digits, 45,000 upper case alphas, and 45,000 lower case alphas. We use two performance measures: the raw error rate (no rejects) and the reject rate required to achieve a 1% error rate on the patterns not rejected. Boosting improved performance in some cases by a factor of three.
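The two performance measures lend themselves to a short sketch. Assuming the classifier exposes a per-pattern confidence score and that rejection discards the least confident patterns first (neither detail is spelled out in the abstract, and the function names below are placeholders), the raw error rate and the reject rate needed to reach a 1% error on the retained patterns could be computed as follows:

```python
import numpy as np

def raw_error_rate(predictions, labels):
    """Error rate with no patterns rejected (the first measure)."""
    return float(np.mean(np.asarray(predictions) != np.asarray(labels)))

def reject_rate_for_target_error(confidences, predictions, labels, target_error=0.01):
    """Smallest fraction of patterns that must be rejected (least confident
    first) so that the error rate on the patterns kept is at most
    `target_error` (1% for the second measure)."""
    correct = np.asarray(predictions) == np.asarray(labels)
    order = np.argsort(np.asarray(confidences))   # least confident patterns first
    correct = correct[order]
    n = correct.size
    for k in range(n + 1):                        # try rejecting the k least confident
        kept = correct[k:]
        if kept.size == 0 or 1.0 - kept.mean() <= target_error:
            return k / n
    return 1.0
```

With this convention, a lower reject rate at the fixed 1% error target indicates a better classifier, which is how the abstract's "factor of three" improvements can be read for the second measure.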
Related Publications
Boosting Performance in Neural Networks
A boosting algorithm, based on the probably approximately correct (PAC) learning model, is used to construct an ensemble of neural networks that significantly improves performance...
Boosting and Other Ensemble Methods
We compare the performance of three types of neural network-based ensemble techniques to that of a single neural network. The ensemble algorithms are two versions of boosting and...
Handwritten Digit Recognition with a Back-Propagation Network
We present an application of back-propagation networks to handwritten digit recognition. Minimal preprocessing of the data was required, but the architecture of the network was highly...
ada: An R Package for Stochastic Boosting
Boosting is an iterative algorithm that combines simple classification rules with "mediocre" performance in terms of misclassification error rate to produce a highly accurate cl...
Experiments with a new boosting algorithm
In an earlier paper, we introduced a new "boosting" algorithm called AdaBoost which, theoretically, can be used to significantly reduce the error of any learning...
Publication Info
- Year: 1992
- Type: article
- Volume: 5
- Pages: 42-49
- Citations: 158
- Access: Closed