Abstract

The Support Vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights, and threshold so as to minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach, where the centers are determined by $k$--means clustering and the weights are found using error backpropagation. We consider three machines, namely a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the US Postal Service database of handwritten digits, the SV machine achieves the highest test accuracy, followed by the hybrid approach. The SV approach is thus not only theoretically well--founded, but also superior in a practical application.
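The structural link the abstract draws between the two machines can be made concrete: the decision function of an SV machine with Gaussian kernel has the same form as an RBF network, with the support vectors playing the role of the centers. A minimal sketch in plain Python follows; the coefficients `alphas` and the threshold `b` would normally be produced by SV training, so the values used below are purely illustrative.

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel: k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

def sv_decision(x, support_vectors, labels, alphas, b, sigma=1.0):
    """SV decision function: sign(sum_i alpha_i * y_i * k(x_i, x) + b).

    Structurally identical to an RBF network whose centers are the
    support vectors x_i; alphas and b come from SV training (here they
    are just example values, not learned coefficients).
    """
    s = sum(a * y * gaussian_kernel(sv, x, sigma)
            for sv, y, a in zip(support_vectors, labels, alphas))
    return 1 if s + b >= 0 else -1

# Illustrative use: two "support vectors" acting as Gaussian centers.
centers = [(0.0, 0.0), (2.0, 2.0)]
labels = [1, -1]
alphas = [1.0, 1.0]
print(sv_decision((0.1, 0.1), centers, labels, alphas, b=0.0))  # near the +1 center
print(sv_decision((1.9, 1.9), centers, labels, alphas, b=0.0))  # near the -1 center
```

The contrast with the classical approach described above is in how the centers and coefficients are obtained ($k$--means plus backpropagation versus SV training), not in the functional form of the classifier.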

Keywords

Backpropagation, Support vector machine, Artificial intelligence, Radial basis function, Artificial neural network, Computer science, Pattern recognition (psychology), Cluster analysis, Kernel (algebra), Radial basis function network, Gaussian function, Gaussian, Radial basis function kernel, Algorithm, Machine learning, Kernel method, Mathematics


Publication Info

Year: 1997
Type: article
Volume: 45
Issue: 11
Pages: 2758-2765
Citations: 1375
Access: Closed

Citation Metrics

Citations (OpenAlex): 1375

Cite This

Bernhard Schölkopf, Kah-Kay Sung, Chris Burges et al. (1997). Comparing support vector machines with Gaussian kernels to radial basis function classifiers. IEEE Transactions on Signal Processing, 45(11), 2758-2765. https://doi.org/10.1109/78.650102

Identifiers

DOI
10.1109/78.650102