Abstract

We introduce the smoothed analysis of algorithms, which continuously interpolates between the worst-case and average-case analyses of algorithms. In smoothed analysis, we measure the maximum over inputs of the expected performance of an algorithm under small random perturbations of that input. We measure this performance in terms of both the input size and the magnitude of the perturbations. We show that the simplex algorithm has smoothed complexity polynomial in the input size and the standard deviation of Gaussian perturbations.
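
A sketch of the quantity described above, in illustrative notation rather than the paper's own: for an algorithm A with running time T_A, the smoothed complexity at input size n and perturbation magnitude σ can be written as

$$C_A(n, \sigma) \;=\; \max_{x \,:\, \|x\| \le 1} \; \mathbb{E}_{g \sim \mathcal{N}(0,\, \sigma^2 I)} \bigl[\, T_A(x + g) \,\bigr],$$

that is, the worst case over (normalized) size-n inputs x of the expected running time after each entry of x is perturbed by independent Gaussian noise of standard deviation σ. The simplex result then bounds this quantity by a polynomial in n and 1/σ.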

Keywords

Algorithm, Measure (data warehouse), Gaussian, Mathematics, Computer science, Polynomial, Simplex, Simplex algorithm, Linear programming, Combinatorics, Data mining

Related Publications

On clusterings

We motivate and develop a natural bicriteria measure for assessing the quality of a clustering that avoids the drawbacks of existing measures. A simple recursive heuristic is sh...

2004 · Journal of the ACM · 842 citations

How fast is the k-means method?

We present polynomial upper and lower bounds on the number of iterations performed by the k-means method (a.k.a. Lloyd's method) for k-means clustering. Our upper bounds are pol...

2005 · Symposium on Discrete Algorithms · 46 citations

Publication Info

Year: 2004
Type: article
Volume: 51
Issue: 3
Pages: 385-463
Citations: 823
Access: Closed

Citation Metrics

823 citations (OpenAlex)

Cite This

Daniel A. Spielman, Shang‐Hua Teng (2004). Smoothed analysis of algorithms. Journal of the ACM, 51(3), 385-463. https://doi.org/10.1145/990308.990310

Identifiers

DOI: 10.1145/990308.990310