Abstract

The title Lasso has been suggested by Tibshirani (1996) as a colourful name for a technique of variable selection which requires the minimization of a sum of squares subject to an l1 bound κ on the solution. This forces zero components in the minimizing solution for small values of κ. Thus this bound can function as a selection parameter. This paper makes two contributions to computational problems associated with implementing the Lasso: (1) a compact descent method for solving the constrained problem for a particular value of κ is formulated, and (2) a homotopy method, in which the constraint bound κ becomes the homotopy parameter, is developed to completely describe the possible selection regimes. Both algorithms have a finite termination property. It is suggested that modified Gram-Schmidt orthogonalization applied to an augmented design matrix provides an effective basis for implementing the algorithms.
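The constrained formulation described in the abstract, minimizing ||y - Xβ||² subject to ||β||₁ ≤ κ, can be illustrated in a few lines of code. The sketch below is not the paper's compact descent method or its homotopy algorithm; it is a minimal NumPy illustration, assuming a projected-gradient solver with the standard Euclidean projection onto the ℓ1 ball and synthetic data, intended only to show how small values of κ force zero components and how variables enter the solution as the bound grows. All function names and the example data are illustrative, not taken from the paper.

```python
import numpy as np

def project_l1_ball(v, kappa):
    """Euclidean projection of v onto the l1 ball {b : ||b||_1 <= kappa}, kappa > 0."""
    if np.abs(v).sum() <= kappa:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]            # magnitudes in decreasing order
    css = np.cumsum(u)
    idx = np.arange(1, len(u) + 1)
    rho = np.nonzero(u * idx > (css - kappa))[0][-1]
    theta = (css[rho] - kappa) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def lasso_constrained(X, y, kappa, n_iter=5000):
    """Minimize ||y - X b||^2 subject to ||b||_1 <= kappa by projected gradient.
    (Illustrative solver only; not the descent or homotopy method of the paper.)"""
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant for grad of 0.5*||y - Xb||^2
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)             # gradient of 0.5 * sum of squares
        b = project_l1_ball(b - grad / L, kappa)
    return b

# Synthetic example: trace the solution over a grid of kappa values to see
# variables enter the model as the l1 bound is relaxed.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))
beta_true = np.array([3.0, -2.0, 0, 0, 1.5, 0, 0, 0])
y = X @ beta_true + 0.1 * rng.standard_normal(50)
for kappa in [0.5, 1.0, 2.0, 4.0, 8.0]:
    print(kappa, np.round(lasso_constrained(X, y, kappa), 2))
```

For small κ only the strongest predictors receive nonzero coefficients, which is the sense in which the bound acts as a selection parameter; the paper's homotopy method characterizes these selection regimes exactly rather than by evaluating a grid of κ values.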

Keywords

Mathematics, Lasso (statistics), Orthogonalization, Mathematical optimization, Homotopy, Least-squares function approximation, Variable selection, Applied mathematics, Constrained optimization, Feature selection, Upper and lower bounds, Algorithm, Estimator, Computer science, Artificial intelligence, Pure mathematics, Statistics, Mathematical analysis

Publication Info

Year: 2000
Type: article
Volume: 20
Issue: 3
Pages: 389-403
Citations: 846
Access: Closed

Citation Metrics

846 (OpenAlex)

Cite This

M. R. Osborne, Brett Presnell, Berwin A. Turlach (2000). A new approach to variable selection in least squares problems. IMA Journal of Numerical Analysis, 20(3), 389-403. https://doi.org/10.1093/imanum/20.3.389

Identifiers

DOI
10.1093/imanum/20.3.389