In analysis of algorithms, **probabilistic analysis of algorithms** is an approach to estimating the computational complexity of an algorithm or a computational problem. It starts from an assumption about a probability distribution over the set of all possible inputs. This assumption is then used to design an efficient algorithm or to derive the complexity of a known algorithm.

This approach is not the same as that of probabilistic algorithms, but the two may be combined.

For non-probabilistic, more specifically deterministic, algorithms, the most common types of complexity estimates are:

- the **average-case complexity** (**expected time complexity**): given an input distribution, the expected running time of the algorithm is evaluated;
- the **almost-always** complexity estimate: given an input distribution, it is shown that the algorithm almost surely satisfies a given complexity bound.
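As an illustration of the first kind of estimate, here is a minimal sketch (not from the original article) that measures the average-case behavior of a simple deterministic quicksort under the common assumption that inputs are uniformly random permutations. The empirical mean comparison count is printed alongside the classical asymptotic reference value 2·n·ln n; the function name and parameters are illustrative choices.

```python
import math
import random

def quicksort_comparisons(a):
    """Count comparisons made by a simple deterministic quicksort
    (first element as pivot) on the list a."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    left = [x for x in a[1:] if x < pivot]
    right = [x for x in a[1:] if x >= pivot]
    # len(a) - 1 comparisons against the pivot, plus recursive costs
    return (len(a) - 1) + quicksort_comparisons(left) + quicksort_comparisons(right)

# Assumed input distribution: uniformly random permutations of 0..n-1.
n, trials = 500, 200
avg = sum(quicksort_comparisons(random.sample(range(n), n))
          for _ in range(trials)) / trials

# Compare the empirical average with the asymptotic estimate 2 n ln n.
print(f"empirical average: {avg:.0f}")
print(f"2 n ln n reference: {2 * n * math.log(n):.0f}")
```

Although this quicksort is Θ(n²) in the worst case (e.g. on already-sorted input), the simulation reflects the average-case Θ(n log n) behavior under the stated distributional assumption, which is the essence of probabilistic analysis.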

