Is ergodicity just the law of large numbers?


There are various laws of large numbers. They have the form

LLN. If [conditions on $X_i$] hold, then $\lim_{n\to\infty}\frac 1 n \sum_{i=1}^n X_i=\mathbb E [X_i]$ (almost surely or in probability, depending on the version).

Is it correct to say that $X_i$ satisfies ergodicity if $\lim_{n\to\infty}\frac 1 n \sum_{i=1}^n X_i=\mathbb E [X_i]$?

In other words, that ergodicity is just the conclusion you get if you apply the law of large numbers? I know that ergodicity is formalized differently, but I am trying to connect it to something I know.

Accepted answer:

No. A process $\ \left\{X_i\right\}_{i=1}^\infty\ $ can be non-ergodic and still satisfy $\ \displaystyle\lim_{n\to\infty}\frac 1 n \sum_{i=1}^n X_i=\mathbb E [X_i]\ $ with probability $1$.

Let $\ \left\{Y_i\right\}_{i=1}^\infty\ $ be a sequence of independent random variables such that the odd-indexed $\ Y_{2i-1},\ i=1,2,\dots\ $ all have distribution $\ F\ $, say, and the even-indexed $\ Y_{2i},\ i=1,2,\dots\ $ all have distribution $\ G\ne F\ $, but with $\ \mathbb E [Y_{2i-1}]=\mathbb E [Y_{2i}]\ $. Now toss a coin once: with probability $\ p\in \left(0,1\right)\ $ set $\ X_i = Y_{2i-1}\ $ for all $\ i\ $, and with probability $\ 1-p\ $ set $\ X_i = Y_{2i}\ $ for all $\ i\ $. Then $\ \displaystyle\lim_{n\to\infty}\frac 1 n \sum_{i=1}^n X_i=\mathbb E [X_i]\ $ with probability $1$, but $\ \left\{X_i\right\}_{i=1}^\infty\ $ is not ergodic.
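The construction can be checked numerically. A minimal sketch, where I instantiate $F$ as Normal$(0,1)$ and $G$ as Normal$(0,2)$ (same mean, different spread) — these concrete distributions are my choice, not part of the answer:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
p = 0.5

# Hypothetical instantiation: F = Normal(0, 1) and G = Normal(0, 2),
# two different distributions with the same mean 0.
if rng.random() < p:                    # one coin flip decides the whole path
    x = rng.normal(0.0, 1.0, size=n)    # X_i = Y_{2i-1} ~ F for all i
else:
    x = rng.normal(0.0, 2.0, size=n)    # X_i = Y_{2i} ~ G for all i

# Whichever branch the coin chose, the sample mean converges to the
# common expectation E[X_i] = 0, so the LLN-style condition holds.
sample_mean = x.mean()
print(sample_mean)                      # close to 0 on either branch
```

The point is that a single coin flip selects the entire path, yet the sample mean cannot distinguish the two branches because the means agree.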

Why $\ \left\{X_i\right\}_{i=1}^\infty\ $ is not ergodic.

One consequence of ergodicity is that $$ \lim_{n\to\infty}\frac 1 n \sum_{i=1}^n f\left(X_i\right)=\mathbb E[f\left(X_i\right)]\ $$ with probability $1$, not only for $\ f(x)\equiv x\ $, but for every measurable function $\ f\ $ with $\ \mathbb E[\,\left\vert f\left(X_i\right)\right\vert\,] <\infty\ $.

Since $\ F\ne G\ $, there must be some measurable set $\ A\ $ such that $\ \mathbb P\left(Y_{2i-1}\in A\right) \ne \mathbb P\left(Y_{2i}\in A\right) \ $. Let $\ f:\mathbb R\rightarrow \mathbb R\ $ be the indicator of $\ A\ $: $$ f(x)=\cases{1 & if $\ x\in A\ $,\\ 0 & if $\ x\not\in A\ $.} $$ Then $\ \mathbb E[f\left(X_i\right)]=p\,\mathbb P\left(Y_{2i-1}\in A\right) + (1-p)\,\mathbb P\left(Y_{2i}\in A\right)\ $, which lies strictly between the two probabilities, but with probability $\ p\ $, $$ \lim_{n\to\infty}\frac 1 n \sum_{i=1}^n f\left(X_i\right)= \mathbb P\left(Y_{2i-1}\in A\right)\ne \mathbb E[f\left(X_i\right)]\ , $$ and with probability $\ 1-p\ $, $$ \lim_{n\to\infty}\frac 1 n \sum_{i=1}^n f\left(X_i\right)= \mathbb P\left(Y_{2i}\in A\right)\ne \mathbb E[f\left(X_i\right)]\ . $$ So the time average of $\ f\left(X_i\right)\ $ is itself a non-degenerate random variable, and $\ \left\{X_i\right\}_{i=1}^\infty\ $ is not ergodic.