Sparse Code Shrinkage: Denoising of Nongaussian Data by Maximum Likelihood Estimation

1999 · Neural Computation · 386 citations

Abstract

Sparse coding is a method for finding a representation of data in which each of the components of the representation is only rarely significantly active. Such a representation is closely related to redundancy reduction and independent component analysis, and has some neurophysiological plausibility. In this article, we show how sparse coding can be used for denoising. Using maximum likelihood estimation of nongaussian variables corrupted by gaussian noise, we show how to apply a soft-thresholding (shrinkage) operator on the components of sparse coding so as to reduce noise. Our method is closely related to the method of wavelet shrinkage, but it has the important benefit over wavelet methods that the representation is determined solely by the statistical properties of the data. The wavelet representation, on the other hand, relies heavily on certain mathematical properties (like self-similarity) that may be only weakly related to the properties of natural data.
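
As a rough illustration of the shrinkage idea described above (and not the authors' code), the Python sketch below applies the soft-thresholding nonlinearity that maximum likelihood estimation yields under a Laplacian prior to components in an assumed orthogonal sparsifying basis W. The basis, the noise level sigma, and the component scale d are placeholders here; in the paper the basis is estimated from the statistical properties of the data (e.g., by sparse coding or ICA) and the shrinkage nonlinearity is derived from the estimated component densities.

    # Minimal sketch, assuming a Laplacian prior on the sparse components,
    # additive Gaussian noise, and an orthogonal sparsifying basis W learned
    # beforehand. A random orthogonal matrix stands in for that basis.
    import numpy as np

    def laplace_shrink(u, sigma, d):
        """Soft thresholding: MAP estimate of a Laplacian variable (std d)
        observed with additive Gaussian noise (std sigma)."""
        threshold = np.sqrt(2.0) * sigma**2 / d
        return np.sign(u) * np.maximum(np.abs(u) - threshold, 0.0)

    def sparse_code_shrinkage(x_noisy, W, sigma, d):
        """Project noisy data onto the sparse components, shrink each
        component, and project back. W is assumed orthogonal."""
        s_noisy = W @ x_noisy              # noisy sparse components
        s_hat = laplace_shrink(s_noisy, sigma, d)
        return W.T @ s_hat                 # back to the original space

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n, N, sigma = 8, 1000, 0.3
        # Hypothetical orthogonal basis and unit-variance Laplacian sources.
        W, _ = np.linalg.qr(rng.standard_normal((n, n)))
        s = rng.laplace(scale=1.0 / np.sqrt(2.0), size=(n, N))
        x = W.T @ s
        x_noisy = x + sigma * rng.standard_normal((n, N))
        x_denoised = sparse_code_shrinkage(x_noisy, W, sigma, d=1.0)
        print("noisy MSE:   ", np.mean((x_noisy - x) ** 2))
        print("denoised MSE:", np.mean((x_denoised - x) ** 2))

Under a Laplacian density the maximum a posteriori estimate reduces to soft thresholding at sqrt(2)·sigma²/d, which is why wavelet-style shrinkage appears here as a special case; sparser (more heavy-tailed) densities lead to different shrinkage nonlinearities.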

Keywords

Wavelet, Pattern recognition (psychology), Sparse approximation, Shrinkage, Thresholding, Noise reduction, Neural coding, Mathematics, Artificial intelligence, Representation (politics), Algorithm, Computer science, Statistics

Publication Info

Year: 1999
Type: article
Volume: 11
Issue: 7
Pages: 1739-1768
Citations: 386
Access: Closed

Cite This

Aapo Hyvärinen (1999). Sparse Code Shrinkage: Denoising of Nongaussian Data by Maximum Likelihood Estimation. Neural Computation, 11(7), 1739-1768. https://doi.org/10.1162/089976699300016214

Identifiers

DOI
10.1162/089976699300016214