Probability Theory - Convergence of Random Variables

In probability theory, there are several notions of convergence for random variables. They are listed below in increasing order of strength, i.e., each subsequent notion of convergence in the list implies convergence according to all of the preceding notions.

Weak convergence: A sequence of random variables X₁, X₂, … converges weakly to the random variable X if their respective cumulative distribution functions F₁, F₂, … converge to the cumulative distribution function F of X at every point where F is continuous. Weak convergence is also called convergence in distribution.
Most common shorthand notation: X_n →ᴰ X.
Convergence in probability: The sequence of random variables X₁, X₂, … is said to converge towards the random variable X in probability if, for every ε > 0, Pr(|X_n − X| ≥ ε) → 0 as n → ∞.
Most common shorthand notation: X_n →ᴾ X.
Strong convergence: The sequence of random variables X₁, X₂, … is said to converge towards the random variable X strongly if Pr(lim_{n→∞} X_n = X) = 1. Strong convergence is also known as almost sure convergence.
Most common shorthand notation: X_n → X almost surely, often abbreviated X_n →ᵃ·ˢ· X.
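Convergence in probability can be seen numerically. The minimal Python sketch below (the function name tail_prob and the fair-coin setting are illustrative choices, not from the original text) estimates Pr(|X̄_n − 1/2| ≥ ε), where X̄_n is the mean of n fair coin flips; by the weak law of large numbers this tail probability shrinks toward 0 as n grows.

```python
import random

def tail_prob(n, eps, trials=2000, seed=0):
    """Estimate Pr(|mean of n fair coin flips - 0.5| >= eps) by simulation."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    hits = 0
    for _ in range(trials):
        # mean of n Bernoulli(1/2) variables
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            hits += 1
    return hits / trials

# The estimated tail probability should decrease as n grows,
# illustrating X_n -> 1/2 in probability.
for n in (10, 100, 1000):
    print(n, tail_prob(n, eps=0.05))
```

Hoeffding's inequality bounds the true tail probability by 2·exp(−2nε²), which already explains the rapid decay visible in the printed values.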

As the names indicate, weak convergence is weaker than strong convergence. In fact, strong convergence implies convergence in probability, and convergence in probability implies weak convergence. The reverse statements are not always true.
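A standard counterexample (supplied here for illustration, not taken from the original text) shows that convergence in probability does not imply almost sure convergence. Take independent events A_n with Pr(A_n) = 1/n and set X_n = 1_{A_n}:

```latex
\[
  \Pr\bigl(|X_n - 0| \ge \varepsilon\bigr) = \Pr(A_n) = \tfrac{1}{n} \to 0
  \quad\text{for every } \varepsilon \in (0,1),
  \qquad\text{so } X_n \xrightarrow{P} 0.
\]
\[
  \sum_{n=1}^{\infty} \Pr(A_n) = \sum_{n=1}^{\infty} \tfrac{1}{n} = \infty,
\]
and since the $A_n$ are independent, the second Borel--Cantelli lemma gives
\[
  \Pr(X_n = 1 \text{ infinitely often}) = 1,
\]
so $X_n$ does not converge to $0$ almost surely.
```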
