Abstract
We introduce the smoothed analysis of algorithms, which continuously interpolates between the worst-case and average-case analyses of algorithms. In smoothed analysis, we measure the maximum over inputs of the expected performance of an algorithm under small random perturbations of that input. We measure this performance in terms of both the input size and the magnitude of the perturbations. We show that the simplex algorithm has smoothed complexity polynomial in the input size and the standard deviation of Gaussian perturbations.
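The smoothed measure described above — the maximum over base inputs of the expected cost under Gaussian perturbations of standard deviation σ — can be illustrated empirically. The sketch below is a toy Monte Carlo estimator with a made-up cost function (left-to-right maxima of a list), not the simplex pivot count analyzed in the paper; `cost`, `smoothed_cost`, and all parameters are hypothetical names for illustration only.

```python
import random
import statistics

def cost(x):
    # Hypothetical stand-in for an algorithm's running-time measure:
    # counts the left-to-right maxima encountered while scanning x.
    best, steps = float("-inf"), 0
    for v in x:
        if v > best:
            best, steps = v, steps + 1
    return steps

def smoothed_cost(base_inputs, sigma, trials=200, seed=0):
    # Smoothed measure: max over base inputs of the EXPECTED cost
    # under i.i.d. Gaussian perturbations with standard deviation sigma.
    rng = random.Random(seed)
    worst = 0.0
    for x in base_inputs:
        avg = statistics.mean(
            cost([v + rng.gauss(0.0, sigma) for v in x])
            for _ in range(trials)
        )
        worst = max(worst, avg)
    return worst
```

With `sigma = 0` this reduces to ordinary worst-case cost over the chosen inputs; as `sigma` grows, adversarially bad inputs are washed out and the measure approaches an average-case one, which is exactly the interpolation the abstract describes.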
Related Publications
Worst-case and Smoothed Analysis of the ICP Algorithm, with an Application to the k-means Method
We show a worst-case lower bound and a smoothed upper bound on the number of iterations performed by the iterative closest point (ICP) algorithm. First proposed by Besl and McKa...
On clusterings
We motivate and develop a natural bicriteria measure for assessing the quality of a clustering that avoids the drawbacks of existing measures. A simple recursive heuristic is sh...
On clusterings-good, bad and spectral
We propose a new measure for assessing the quality of a clustering. A simple heuristic is shown to give worst-case guarantees under the new measure. Then we present two results ...
Robust Solutions to Least-Squares Problems with Uncertain Data
We consider least-squares problems where the coefficient matrices A, b are unknown but bounded. We minimize the worst-case residual error using (convex) second-order cone progra...
How fast is the k-means method?
We present polynomial upper and lower bounds on the number of iterations performed by the k-means method (a.k.a. Lloyd's method) for k-means clustering. Our upper bounds are pol...
Publication Info
- Year: 2004
- Type: article
- Volume: 51
- Issue: 3
- Pages: 385-463
- Citations: 823
- Access: Closed
Identifiers
- DOI: 10.1145/990308.990310