
Convergence of Random Variables

In probability theory, there are several notions of convergence for random variables. They are listed below in increasing order of strength, i.e., any subsequent notion of convergence in the list implies convergence according to all of the preceding notions.

Weak convergence: A sequence of random variables $X_1, X_2, \dots$ converges weakly to the random variable $X$ if their respective cumulative distribution functions $F_1, F_2, \dots$ converge to the cumulative distribution function $F$ of $X$ at every point where $F$ is continuous. Weak convergence is also called convergence in distribution.
Most common shorthand notation: $X_n \xrightarrow{\mathcal{D}} X$
Convergence in probability: The sequence of random variables $X_1, X_2, \dots$ is said to converge towards the random variable $X$ in probability if $\lim_{n\to\infty} P\left(\left|X_n - X\right| \geq \varepsilon\right) = 0$ for every $\varepsilon > 0$. (A numerical sketch of this definition follows the list below.)
Most common shorthand notation: $X_n \xrightarrow{P} X$
Strong convergence: The sequence of random variables $X_1, X_2, \dots$ is said to converge towards the random variable $X$ strongly if $P\left(\lim_{n\to\infty} X_n = X\right) = 1$. Strong convergence is also known as almost sure convergence.
Most common shorthand notation: $X_n \xrightarrow{\mathrm{a.s.}} X$
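
To make the definition of convergence in probability concrete, the following Python sketch (purely illustrative; the random seed, sample sizes, and choice of epsilon are assumptions made here, not taken from the text above) estimates P(|mean_n − 1/2| ≥ ε) for the sample mean of n i.i.d. Uniform(0, 1) variables. By the weak law of large numbers this probability should shrink toward 0 as n grows, which is exactly convergence in probability of the sample mean to 1/2.

# Illustrative sketch: empirical check that the sample mean of i.i.d.
# Uniform(0, 1) variables converges in probability to 0.5.
import numpy as np

rng = np.random.default_rng(0)
epsilon = 0.01       # the epsilon in the definition of convergence in probability
num_trials = 2000    # Monte Carlo repetitions used to estimate the probability

for n in (10, 100, 1000, 10000):
    samples = rng.uniform(0.0, 1.0, size=(num_trials, n))
    sample_means = samples.mean(axis=1)
    # Estimate P(|mean_n - 0.5| >= epsilon); it should decrease toward 0 as n grows.
    prob_estimate = np.mean(np.abs(sample_means - 0.5) >= epsilon)
    print(f"n = {n:6d}   estimated P(|mean - 0.5| >= {epsilon}) = {prob_estimate:.3f}")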

As the names indicate, weak convergence is weaker than strong convergence. In fact, strong convergence implies convergence in probability, and convergence in probability implies weak convergence. The reverse statements are not always true.
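
A standard textbook counterexample (not stated in the original text) illustrates why the reverse implications can fail: take independent random variables $X_n$ with $P(X_n = 1) = 1/n$ and $P(X_n = 0) = 1 - 1/n$. Then $P(|X_n - 0| \geq \varepsilon) = 1/n \to 0$, so $X_n$ converges to 0 in probability, yet by the second Borel–Cantelli lemma $X_n = 1$ infinitely often with probability 1, so $X_n$ does not converge to 0 almost surely. The Python sketch below simulates one sample path of this sequence.

# Illustrative sketch of the counterexample above: X_n independent with
# P(X_n = 1) = 1/n converges to 0 in probability, but along a typical sample
# path the value 1 keeps recurring, so the path does not settle at 0.
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
n = np.arange(1, N + 1)
x = (rng.random(N) < 1.0 / n).astype(int)   # X_n = 1 with probability 1/n

ones_indices = n[x == 1]
print("number of indices n <= 1e6 with X_n = 1:", x.sum())
print("largest such index:", ones_indices.max() if ones_indices.size else None)
# The count of ones grows roughly like log(N), so ones never stop appearing
# as N increases: there is no almost-sure convergence to 0.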

