Abstract

We propose and evaluate a family of methods for converting classifier learning algorithms and classification theory into cost-sensitive algorithms and theory. The proposed conversion is based on cost-proportionate weighting of the training examples, which can be realized either by feeding the weights to the classification algorithm (as often done in boosting), or by careful subsampling. We give some theoretical performance guarantees on the proposed methods, as well as empirical evidence that they are practical alternatives to existing approaches. In particular, we propose costing, a method based on cost-proportionate rejection sampling and ensemble aggregation, which achieves excellent predictive performance on two publicly available datasets, while drastically reducing the computation required by other methods.
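As a concrete illustration of the costing idea described in the abstract, here is a minimal Python sketch (assuming numpy, scikit-learn, binary 0/1 labels, and per-example costs). The function and parameter names (rejection_sample, costing, n_members) are illustrative assumptions, not the authors' reference implementation: each ensemble member is trained on an independent cost-proportionate rejection sample, and predictions are aggregated by averaging votes.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def rejection_sample(X, y, costs, rng):
    # Keep example i with probability costs[i] / Z, where Z upper-bounds the costs.
    Z = costs.max()
    keep = rng.random(len(costs)) < costs / Z
    return X[keep], y[keep]

def costing(X, y, costs, n_members=10, seed=0):
    # Train each member on its own cost-proportionate rejection sample;
    # each sample is much smaller than the full data when costs are skewed,
    # which is the source of the computational savings.
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(n_members):
        Xs, ys = rejection_sample(X, y, costs, rng)
        members.append(DecisionTreeClassifier().fit(Xs, ys))
    def predict(X_new):
        # Aggregate by majority vote over the ensemble (labels assumed 0/1).
        votes = np.mean([m.predict(X_new) for m in members], axis=0)
        return (votes >= 0.5).astype(int)
    return predict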

Keywords

Boosting (machine learning), Weighting, Activity-based costing, Computer science, Computation, Classifier (UML), Machine learning, Artificial intelligence, Ensemble learning, Data mining, Algorithm

Publication Info

Year: 2004
Type: article
Pages: 435-442
Citations: 669 (OpenAlex)
Access: Closed

Cite This

Bianca Zadrozny, John Langford, Naoki Abe (2004). Cost-sensitive learning by cost-proportionate example weighting. Proceedings of the Third IEEE International Conference on Data Mining (ICDM 2003), 435-442. https://doi.org/10.1109/icdm.2003.1250950

Identifiers

DOI
10.1109/icdm.2003.1250950