Better Subset Regression Using the Nonnegative Garrote

1995 · Technometrics · 617 citations

Abstract

A new method, called the nonnegative (nn) garrote, is proposed for doing subset regression. It both shrinks and zeroes coefficients. In tests on real and simulated data, it produces lower prediction error than ordinary subset selection. It is also compared to ridge regression. If the regression equations generated by a procedure do not change drastically with small changes in the data, the procedure is called stable. Subset selection is unstable, ridge is very stable, and the nn-garrote is intermediate. Simulation results illustrate the effects of instability on prediction error.

KEY WORDS: Little bootstrap; Model error; Prediction; Stability.
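
The abstract describes the nonnegative garrote only at a high level. As a rough illustration, here is a minimal sketch of the standard formulation: fit ordinary least squares, then scale each coefficient by a nonnegative factor c_k chosen to minimize the residual sum of squares under the budget sum(c_k) <= s, so that factors driven to zero delete variables while the remaining ones shrink. The variable names, the use of scipy.optimize, and the budget value in the example are illustrative assumptions, not details taken from the paper.

# Minimal sketch of a nonnegative-garrote fit (illustrative, not the paper's code).
import numpy as np
from scipy.optimize import minimize

def nn_garrote(X, y, s):
    """Return garrote coefficients c_k * beta_hat_k for a given budget s."""
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # ordinary least squares fit
    Z = X * beta_hat                                     # columns rescaled by OLS coefficients

    def rss(c):
        r = y - Z @ c
        return r @ r                                     # residual sum of squares of rescaled fit

    p = X.shape[1]
    bounds = [(0.0, None)] * p                           # c_k >= 0
    cons = [{"type": "ineq", "fun": lambda c: s - c.sum()}]  # sum(c_k) <= s
    res = minimize(rss, x0=np.full(p, s / p), bounds=bounds, constraints=cons)
    c = np.where(res.x < 1e-8, 0.0, res.x)               # factors driven to zero delete variables
    return c * beta_hat

# Tiny usage example on simulated data in which two coefficients are truly zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([3.0, 1.5, 0.0, 0.0, 2.0]) + rng.normal(size=100)
print(nn_garrote(X, y, s=2.5))

In practice the budget s is a tuning parameter chosen by an estimate of prediction error; the "little bootstrap" listed among the key words suggests the paper's own procedure for this, whereas the sketch above leaves the choice of s to the user.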

Keywords

Regression; Ridge; Mathematics; Selection (genetic algorithm); Regression analysis; Statistics; Linear regression; Mean squared prediction error; Stability (learning theory); Stepwise regression; Instability; Applied mathematics; Computer science; Artificial intelligence; Machine learning; Geology

Publication Info

Year: 1995
Type: article
Volume: 37
Issue: 4
Pages: 373-384
Citations: 617
Access: Closed

Citation Metrics

617 (source: OpenAlex)

Cite This

Leo Breiman (1995). Better Subset Regression Using the Nonnegative Garrote. Technometrics, 37(4), 373-384. https://doi.org/10.1080/00401706.1995.10484371

Identifiers

DOI
10.1080/00401706.1995.10484371