Abstract

Computational comparison is made between two feature selection approaches for finding a separating plane that discriminates between two point sets in an n-dimensional feature space while utilizing as few of the n features (dimensions) as possible. In the concave minimization approach [19, 5], a separating plane is generated by minimizing a weighted sum of distances of misclassified points to two parallel planes that bound the sets and that determine the separating plane midway between them; in addition, the number of dimensions of the space used to determine the plane is minimized. In the support vector machine approach [27, 7, 1, 10, 24, 28], in addition to minimizing the weighted sum of distances of misclassified points to the bounding planes, we also maximize the distance between the two bounding planes that generate the separating plane. Computational results show that feature suppression is an indirect consequence of the support vector machine approach when an appropriate norm is used. Numerical tests on 6 public data sets show that classifiers trained by the concave minimization approach and those trained by a support vector machine have comparable 10-fold cross-validation correctness. However, on all data sets tested, the classifiers obtained by the concave minimization approach selected fewer problem features than those trained by a support vector machine.
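To make the geometry in the abstract concrete, the following is a minimal numpy sketch of the margin-plus-sparsity idea: points that fall on the wrong side of the bounding planes incur a hinge penalty, while a 1-norm penalty on the plane's normal w drives the weights of unneeded features to zero. This is an illustrative proximal-subgradient stand-in, not the paper's concave-minimization or linear-programming formulation; all variable names and parameter values here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 20, 3                      # samples, total features, informative features
w_true = np.zeros(d)
w_true[:k] = [2.0, -1.5, 1.0]             # only the first k features carry signal
X = rng.normal(size=(n, d))
y = np.sign(X @ w_true + 0.1 * rng.normal(size=n))

w, b = np.zeros(d), 0.0
lam, lr = 0.2, 0.1                        # 1-norm weight and step size
for _ in range(500):
    margins = y * (X @ w + b)
    mask = margins < 1                    # points violating the bounding planes
    if mask.any():
        grad_w = -(y[mask, None] * X[mask]).mean(axis=0)
        grad_b = -y[mask].mean()
        w -= lr * grad_w
        b -= lr * grad_b
    # soft-threshold step for the 1-norm penalty: zeroes out small weights
    w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)

kept = int(np.sum(np.abs(w) > 1e-8))      # features actually used by the plane
acc = float(np.mean(np.sign(X @ w + b) == y))
```

On this synthetic data the 1-norm penalty suppresses most of the 20 features, illustrating how the choice of norm, rather than an explicit feature count, can yield feature selection.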

Keywords

Minification · Support vector machine · Computer science · Selection (genetic algorithm) · Feature selection · Artificial intelligence · Feature (linguistics) · Pattern recognition (psychology) · Algorithm

Publication Info

Year: 1998
Type: article
Pages: 82-90
Citations: 993
Access: Closed

Citation Metrics

993 citations (OpenAlex)

Cite This

Paul S. Bradley, O. L. Mangasarian (1998). Feature Selection via Concave Minimization and Support Vector Machines, 82-90.