Abstract

Memory-based classification algorithms such as radial basis functions or K-nearest neighbors typically rely on simple distance measures (e.g., Euclidean distance or dot product), which are not particularly meaningful on pattern vectors. More complex, better-suited distance measures are often expensive and rather ad hoc (elastic matching, deformable templates). We propose a new distance measure which (a) can be made locally invariant to any set of transformations of the input and (b) can be computed efficiently. We tested the method on large handwritten character databases provided by the Post Office and NIST. Using invariances with respect to translation, rotation, scaling, shearing, and line thickness, the method consistently outperformed all other systems tested on the same databases.
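The distance described in the abstract is based on the idea of approximating the manifold of transformed versions of a pattern by its tangent plane, and measuring distance to that plane rather than to the pattern itself. As a hedged illustration only (not the paper's exact formulation), a minimal one-sided variant can be sketched in NumPy, with a translation tangent obtained by finite differences; all names here are illustrative:

```python
import numpy as np

def tangent_distance(x, y, tangents):
    """One-sided tangent distance: min over a of ||x + T a - y||.

    x, y     : flattened pattern vectors
    tangents : list of tangent vectors spanning the local
               transformation directions at x
    """
    T = np.stack(tangents, axis=1)          # columns = tangent vectors
    # Least squares picks transformation coefficients a that move x
    # as close as possible to y within the tangent plane.
    a, *_ = np.linalg.lstsq(T, y - x, rcond=None)
    return float(np.linalg.norm(x + T @ a - y))

# Toy example: a 1-D "pattern" and a slightly translated copy.
t = np.linspace(-5.0, 5.0, 201)
x = np.exp(-t**2)
y = np.exp(-(t - 0.4)**2)                   # x translated by 0.4

# Translation tangent: derivative of x along t (finite differences).
translation_tangent = np.gradient(x, t)

d_tan = tangent_distance(x, y, [translation_tangent])
d_euc = float(np.linalg.norm(x - y))
# Because the translation is (approximately) spanned by the tangent,
# the tangent distance is much smaller than the Euclidean distance.
```

Solving the small least-squares problem costs only a few inner products per tangent vector, which is the sense in which such a distance "can be computed efficiently" while remaining locally invariant to the chosen transformations.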

Keywords

Euclidean distance, NIST, Computer science, Pattern recognition, Dot product, Scaling, Artificial intelligence, Invariant, k-nearest neighbors algorithm, Translation, Transformation, Algorithm, Mathematics, Speech recognition, Geometry

Publication Info

Year: 1992
Type: article
Volume: 5
Pages: 50-58
Citations: 464
Access: Closed

Cite This

Patrice Simard, Yann LeCun, John S. Denker (1992). Efficient Pattern Recognition Using a New Transformation Distance. Neural Information Processing Systems, 5, 50-58.