
Weak Law of Large Numbers

Definition

The Weak Law of Large Numbers (WLLN) states that, for a sequence of independent and identically distributed (i.i.d.) random variables $X_1, X_2, X_3, \ldots, X_n$ with finite mean $\mu$ and variance $\sigma^2$, the sample average

$$\bar{X}_n = \frac{1}{n}(X_1 + X_2 + \ldots + X_n)$$

converges in probability to the expected value $\mu$ as $n$ approaches $\infty$.

In mathematical terms, for any positive number $\epsilon$,

$$P(|\bar{X}_n - \mu| < \epsilon) \rightarrow 1 \text{ as } n \rightarrow \infty. \tag{1}$$

This can also be written as

$$\bar{X}_n \xrightarrow{p} \mu \text{ as } n \to \infty,$$

where $\xrightarrow{p}$ denotes convergence in probability.

It's important to note that, for any finite sample size, there is still a non-zero probability that the sample mean will differ significantly from the population mean. The WLLN does not guarantee that the sample mean will equal the population mean, only that it gets arbitrarily close to it, with probability approaching one, as the sample size increases.

Another simplified notation for $(1)$ is

plim Xˉn=μ.\text{plim }\bar{X}_n = \mu.

Proof

Applying Chebyshev's inequality to the sample mean, we know that, for any $m > 0$,

$$P(|\bar{X}_n - \mu| \geq m) \leq \frac{\sigma^2}{m^2 n}.$$

Since probabilities are non-negative, the above expression can be written as

$$0 \leq P(|\bar{X}_n - \mu| \geq m) \leq \frac{\sigma^2}{m^2 n}.$$

We know that

$$\lim_{n\to \infty}\frac{\sigma^2}{m^2 n} = 0$$

since $\sigma^2$ and $m^2$ are constants. By the squeeze theorem, it follows that

$$\begin{align*}
\lim_{n\to \infty} P(|\bar{X}_n - \mu| \geq m) &= 0\\
\implies \lim_{n\to \infty} \left[ 1 - P(|\bar{X}_n - \mu| < m) \right] &= 0\\
\implies \lim_{n\to \infty} P(|\bar{X}_n - \mu| < m) &= 1. \quad \blacksquare
\end{align*}$$
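The Chebyshev bound driving the proof can be checked numerically. The sketch below uses an assumed setup not taken from the text: Bernoulli$(1/2)$ samples, so $\mu = 1/2$ and $\sigma^2 = 1/4$. It estimates the tail probability $P(|\bar{X}_n - \mu| \geq m)$ by Monte Carlo and compares it with the bound $\sigma^2/(m^2 n)$.

```python
import random

random.seed(0)  # fixed seed for reproducibility

def tail_prob(n, m, trials=20_000):
    """Monte Carlo estimate of P(|X_bar_n - mu| >= m) for Bernoulli(1/2),
    where mu = 1/2. Illustrative helper, not from the original text."""
    mu = 0.5
    hits = 0
    for _ in range(trials):
        xbar = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(xbar - mu) >= m:
            hits += 1
    return hits / trials

n, m, var = 100, 0.1, 0.25          # Bernoulli(1/2): sigma^2 = 1/4
bound = var / (m**2 * n)            # Chebyshev bound sigma^2 / (m^2 n)
print(f"empirical tail: {tail_prob(n, m):.4f} <= bound: {bound:.4f}")
```

The empirical tail probability sits well below the bound, which is expected: Chebyshev's inequality is loose in general, but it is sufficient for the limit argument above, since the bound itself tends to $0$ as $n \to \infty$.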