Abstract

A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map—for instance, the space of all possible five-pixel products in 16 × 16 images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition.
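As a rough illustration of the computation the abstract describes, the following Python/NumPy sketch performs kernel PCA with a polynomial kernel: it builds the kernel (Gram) matrix of the training data without ever forming the high-dimensional feature map, centers it in feature space, solves the resulting eigenvalue problem, and projects the data onto the leading nonlinear components. The function name kernel_pca, the degree-5 polynomial kernel, and the toy data are illustrative assumptions, not code from the paper.

```python
import numpy as np

def kernel_pca(X, n_components=2, degree=5):
    """Sketch of kernel PCA with a polynomial kernel k(x, y) = (x . y) ** degree."""
    n = X.shape[0]
    # Kernel (Gram) matrix on the training data; the nonlinear feature map
    # is only used implicitly through the kernel function.
    K = (X @ X.T) ** degree
    # Center the data in feature space: K_c = K - 1n K - K 1n + 1n K 1n,
    # where 1n is the n x n matrix with all entries 1/n.
    one_n = np.ones((n, n)) / n
    K_c = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Solve the kernel eigenvalue problem; eigh returns ascending eigenvalues,
    # so reverse to get the leading components first.
    eigvals, eigvecs = np.linalg.eigh(K_c)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    # Normalize the expansion coefficients so each feature-space eigenvector
    # has unit length: lambda_k * (alpha_k . alpha_k) = 1.
    alphas = np.column_stack([
        eigvecs[:, k] / np.sqrt(eigvals[k]) for k in range(n_components)
    ])
    # Nonlinear principal components of the training points: projections of
    # the (implicitly) mapped data onto the feature-space eigenvectors.
    return K_c @ alphas

# Toy usage: 100 points in 3 dimensions, projected onto 2 nonlinear components.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
Z = kernel_pca(X, n_components=2, degree=5)
print(Z.shape)  # (100, 2)
```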

Keywords

Kernel principal component analysis, Principal component analysis, Eigenvalues and eigenvectors, Nonlinear system, Kernel (algebra), Mathematics, Pattern recognition, Pixel, Polynomial, Artificial intelligence, Kernel method, Polynomial kernel, Operator (mathematics), Component, Algorithm, Feature extraction, Computer science, Mathematical analysis, Pure mathematics, Support vector machine

Publication Info

Year: 1998
Type: Article
Volume: 10
Issue: 5
Pages: 1299-1319
Citations: 7939
Access: Closed

Citation Metrics

7939 (OpenAlex)

Cite This

Bernhard Schölkopf, Alexander J. Smola, Klaus‐Robert Müller (1998). Nonlinear Component Analysis as a Kernel Eigenvalue Problem. Neural Computation, 10(5), 1299-1319. https://doi.org/10.1162/089976698300017467

Identifiers

DOI
10.1162/089976698300017467