Stochastic convergence

Stochastic convergence is a mathematical concept that formalizes the idea that a sequence of essentially random or unpredictable events can be expected to settle into a pattern.

The pattern may for instance be
 * Convergence in the classical sense to a fixed value, perhaps itself coming from a random event
 * An increasing similarity of outcomes to what a purely deterministic function would produce
 * An increasing preference towards a certain outcome
 * An increasing "aversion" against straying far away from a certain outcome

Some less obvious, more theoretical patterns could be
 * That the probability distribution describing the next outcome may grow increasingly similar to a certain distribution
 * That the series formed by calculating the expected value of the outcome's distance from a particular value may converge to 0
 * That the variance of the random variable describing the next event grows smaller and smaller.

Various possible modes of stochastic convergence
Four different varieties of stochastic convergence are distinguished:

 * Almost sure convergence
 * Convergence in probability
 * Convergence in distribution
 * Convergence in r-th order mean

Almost sure convergence
This is the type of stochastic convergence that is most similar to pointwise convergence known from elementary real analysis.

Informal description of almost sure convergence
We are confronted with an infinite sequence of random experiments: experiment 1, experiment 2, experiment 3, ..., where the outcome of each experiment generates a real number. The random experiments will thus generate a sequence of real numbers, typically denoted x1, x2, x3, ... .

If we have formulas available that describe the probabilities involved in each experiment, then we may say something about the probability that this sequence will converge to a given value.

If this probability is 1, then the phenomenon of "almost sure convergence" is present.

Note that in advanced treatments the outcomes are not restricted to real numbers.

Formal definition
Let X0, X1, X2, ... be an infinite sequence of random variables taking values in a subset of R.

Then the actual outcomes will be an ordinary sequence of real numbers.

If the probability that this sequence will converge to a given real number a equals 1, then we say the original sequence of stochastic variables has almost sure convergence to a.

In more compact notation:

 * If $$P(\lim_{i \to \infty} X_i = a) = 1 $$ for some real number a, then the sequence has almost sure convergence to $$a$$.

Note that we may replace the real number a above by a real-valued function $$f(i)$$ of i, and obtain almost sure convergence to a function rather than a fixed number.

The limit a may itself be a random variable X. In that case the compact but somewhat confusing notation $$P(\lim_{i \to \infty} X_i = X) = 1 $$ is often used.

Commonly used notation: $$X_i \stackrel{a.s.}{\rightarrow} a $$, $$X_i \stackrel{a.s.}{\rightarrow} X $$.
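A standard concrete instance of almost sure convergence is the strong law of large numbers. The following sketch (the fair-coin model, the seed, and the function name are choices made here for illustration, not part of the definition above) tracks the running sample means of coin flips, which converge almost surely to 0.5:

```python
import random

def running_means(n_flips, seed=0):
    """Running sample means of fair-coin flips (1 = heads, 0 = tails).

    By the strong law of large numbers, this sequence of means converges
    to 0.5 almost surely, i.e. for almost every realized flip sequence.
    """
    rng = random.Random(seed)
    total = 0
    means = []
    for i in range(1, n_flips + 1):
        total += rng.randint(0, 1)
        means.append(total / i)
    return means

means = running_means(100_000)
print(means[-1])  # close to 0.5 for this realization
```

Each run of this simulation produces one realized sequence; almost sure convergence is the statement that the set of flip sequences whose running means fail to converge to 0.5 has probability zero.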

Convergence in probability
The basic idea is that the probability of an "unusual" outcome becomes smaller and smaller as the sequence progresses.

Formal definition
Let $$\scriptstyle X_0, X_1, ... $$ be an infinite sequence of random variables taking values in a subset of R.

If there exists a real number a such that $$\lim_{i \to \infty} P( |X_i - a| > \varepsilon) = 0 $$ for all $$\varepsilon >0$$, then the sequence has convergence in probability to a.

Commonly used notation: $$X_i \stackrel{P}{\rightarrow} a$$.
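The defining limit can be estimated by simulation. In this sketch (the uniform model, the tolerance $$\varepsilon = 0.05$$, and all names are illustrative assumptions), $$X_n$$ is the mean of n Uniform(0, 1) draws, which converges in probability to a = 0.5 by the weak law of large numbers:

```python
import random

def tail_probability(n, eps=0.05, trials=2000, seed=0):
    """Monte Carlo estimate of P(|X_n - 0.5| > eps), where X_n is the
    mean of n Uniform(0, 1) draws.  Convergence in probability to 0.5
    means this probability tends to 0 as n grows, for every eps > 0."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        mean = sum(rng.random() for _ in range(n)) / n
        if abs(mean - 0.5) > eps:
            exceed += 1
    return exceed / trials

# The estimated tail probability shrinks as n increases.
print(tail_probability(10), tail_probability(100), tail_probability(1000))
```

Note that the definition requires the limit to hold for every $$\varepsilon > 0$$; the simulation checks only one fixed tolerance.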

Convergence in distribution
With this mode of convergence, the next outcome in a sequence of random experiments is increasingly well modeled by a given probability distribution.

Formal definition
Given a random variable X with a cumulative distribution function F(x), let $$X_i$$ be a sequence of random variables, each with cumulative distribution function $$F_i (x)$$, respectively.

If $$\scriptstyle \lim_{i \to \infty} F_i (x) = F(x)$$ for all x where F(x) is continuous, then the sequence $$X_i$$ of stochastic variables converges in distribution to the distribution of $$X$$.

Commonly used notation: $$X_i \stackrel{L}{\rightarrow} X$$. One can also name the limiting distribution directly: if, for instance, X is normally distributed with mean 0 and variance 1, one could write $$X_i \stackrel{L}{\rightarrow} N(0,1)$$.
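The central limit theorem is the classic example of convergence in distribution. As a hedged sketch (the uniform model, sample sizes, and function names are assumptions made here), the standardized mean of n Uniform(0, 1) draws converges in distribution to N(0, 1), so its empirical CDF approaches the standard normal CDF:

```python
import math
import random

def standardized_mean(n, rng):
    """sqrt(n) * (mean of n Uniform(0, 1) draws - 0.5) / sd.
    By the central limit theorem this converges in distribution to N(0, 1)."""
    sd = 1 / math.sqrt(12)  # standard deviation of Uniform(0, 1)
    mean = sum(rng.random() for _ in range(n)) / n
    return math.sqrt(n) * (mean - 0.5) / sd

def empirical_cdf_at(x, n, trials=5000, seed=0):
    """Fraction of simulated values <= x: an estimate of F_n(x)."""
    rng = random.Random(seed)
    return sum(standardized_mean(n, rng) <= x for _ in range(trials)) / trials

# Standard normal CDF at 1, via the error function.
phi_at_1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))
print(empirical_cdf_at(1.0, n=50), phi_at_1)
```

Here $$F_n(1)$$ is already close to $$F(1) \approx 0.841$$ for moderate n, even though no individual outcome converges to anything.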

Convergence in r-th order mean
This is a rather "technical" mode of convergence. We essentially compute a sequence of real numbers, one number for each random variable, and check whether this sequence converges in the ordinary sense.

Formal definition
If $$\scriptstyle \lim_{n \to \infty} E(|X_n - a|^r )=0$$ for some real number a, then the sequence $$X_n$$ converges in r-th order mean to a.

Commonly used notation: $$X_n \stackrel{L_r}{\rightarrow} a$$.
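The "sequence of real numbers" in question is $$E(|X_n - a|^r)$$, one expectation per variable. In this sketch (the choice $$X_n = a + U/n$$ with U uniform on (-1, 1), a = 0, and r = 2 is purely illustrative), that expectation can be estimated by Monte Carlo and compared with its exact value:

```python
import random

def rth_mean_error(n, r=2, trials=10_000, seed=0):
    """Monte Carlo estimate of E(|X_n - a|^r) for the illustrative choice
    X_n = a + U/n with U ~ Uniform(-1, 1) and a = 0.  The exact value is
    1 / ((r + 1) * n**r), which tends to 0, so X_n converges to a in
    r-th order mean."""
    rng = random.Random(seed)
    return sum(abs(rng.uniform(-1, 1) / n) ** r for _ in range(trials)) / trials

# The sequence of expectations decreases toward 0 as n grows.
print(rth_mean_error(1), rth_mean_error(10), rth_mean_error(100))
```

For r = 2 this mode is usually called convergence in mean square, and for r = 1 convergence in mean.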

Relations between the different modes of convergence

 * If a sequence of random variables has almost sure convergence, then it also has convergence in probability.
 * If a sequence of random variables has convergence in probability, then it also has convergence in distribution.
 * If a sequence of random variables has convergence in (r+1)-th order mean, then it also has convergence in r-th order mean (r>0).
 * If a sequence of random variables has convergence in r-th order mean, then it also has convergence in probability.
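The last implication, for example, is a one-line consequence of Markov's inequality (a sketch, for any fixed $$\varepsilon > 0$$ and r > 0):

```latex
P(|X_n - a| > \varepsilon)
  = P\bigl(|X_n - a|^r > \varepsilon^r\bigr)
  \le \frac{E(|X_n - a|^r)}{\varepsilon^r}
  \;\to\; 0 \quad \text{as } n \to \infty .
```

So if the r-th order mean tends to 0, the probability of straying more than $$\varepsilon$$ from a tends to 0 as well.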

Related topics

 * Probability
 * Probability theory
 * Random variable
 * Stochastic process
 * Time series
 * Stochastic differential equation
 * Stochastic modeling