communicating the idea of stationary distribution and ergodic theorem


I will be teaching Markov chains for the first time to someone who has little background in them. My own background is limited too, so I am trying to read and understand the material intuitively in order to convey the ideas in layman's terms.

Consider an irreducible, positive recurrent Markov chain on $3$ states with unique stationary probability distribution $\pi$ given by $\pi(1)=\pi_1=\frac{1}{2}$, $\pi(2)=\pi_2=\frac{1}{3}$, $\pi(3)=\pi_3=\frac{1}{6}$. Let $v_i(n)$ denote the number of visits to state $i$ before time $n$. A well-known theorem says the following:

$$\mathbb P\left[\lim_{n\to \infty}\frac{v_i(n)}{n}=\pi_i\right]=1.$$

  1. Can I tell the target audience the following? Think of the stationary probability distribution this way: if you watch the evolution of the Markov chain for a long time, say $10$ (or should I say $100$?) hours, with each discrete time step being one minute, then you will see the chain in state $1$ for about $\frac{1}{2}\times 10 = 5$ hours, in state $2$ for about $\frac{1}{3}\times 10$ hours, and in state $3$ for about $\frac{1}{6}\times 10$ hours. Is that the correct way to convey the intuition behind a stationary probability distribution? They may ask in what order the chain spends those hours.

  2. And the limit above makes precise the statement in point $1$: it holds with probability $1$.

Please let me know, and suggest whether this can be made even easier to convey.
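To make the long-run-frequency intuition concrete, here is a small simulation sketch (my own illustration, not part of the question). The transition matrix `P` is one I constructed (via a Metropolis-style recipe) so that its stationary distribution is $\pi = (1/2, 1/3, 1/6)$; any irreducible chain with that stationary distribution would work just as well.

```python
import random

# A transition matrix whose stationary distribution is pi = (1/2, 1/3, 1/6).
# (Constructed via a Metropolis recipe; one can check pi P = pi directly.)
P = [
    [2/3, 2/9, 1/9],  # transition probabilities from state 1
    [1/3, 1/2, 1/6],  # from state 2
    [1/3, 1/3, 1/3],  # from state 3
]

def visit_frequencies(n, start=0, seed=0):
    """Run the chain for n steps and return the fractions v_i(n)/n."""
    rng = random.Random(seed)
    visits = [0, 0, 0]
    state = start
    for _ in range(n):
        visits[state] += 1
        state = rng.choices([0, 1, 2], weights=P[state])[0]
    return [v / n for v in visits]

freq = visit_frequencies(100_000)
print(freq)  # close to [1/2, 1/3, 1/6], regardless of the start state
```

Showing the audience that the empirical fractions barely change when you rerun with a different seed or start state is a nice way to dramatize the "with probability $1$" part.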

  3. Another thing: if $f:\{1,2,3\}\to \mathbb R$ is any bounded function, say $f(x)=e^x$, the ergodic theorem says $$\mathbb P\left[\lim_{n\to \infty}\frac{1}{n}\sum\limits_{k=0}^{n-1}f(X_k)=\sum\limits_{i\in\{1,2,3\}}f(i)\pi_i\right]=1.$$

I am not sure what the intuition is for introducing $f$ here, or how to convey the usefulness of this result to a general audience without much background in Markov chains.
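One common way to motivate $f$ is as a "reward" (or cost, or measurement) attached to each state: the ergodic theorem then says the long-run average reward per step equals the $\pi$-weighted average reward. The sketch below checks this numerically for $f(x)=e^x$, using the same hypothetical transition matrix as above, which I constructed to have stationary distribution $\pi=(1/2,1/3,1/6)$.

```python
import math
import random

# Transition matrix with stationary distribution pi = (1/2, 1/3, 1/6),
# as in the earlier sketch; states are 0-indexed, representing {1, 2, 3}.
P = [[2/3, 2/9, 1/9], [1/3, 1/2, 1/6], [1/3, 1/3, 1/3]]
pi = [1/2, 1/3, 1/6]

def f(i):
    """The reward function f(x) = e^x on {1, 2, 3} (state i represents i+1)."""
    return math.exp(i + 1)

rng = random.Random(1)
n, state, total = 200_000, 0, 0.0
for _ in range(n):
    total += f(state)                                    # accumulate reward
    state = rng.choices([0, 1, 2], weights=P[state])[0]  # take one step

time_avg = total / n                                # (1/n) * sum of f(X_k)
space_avg = sum(f(i) * pi[i] for i in range(3))     # e/2 + e^2/3 + e^3/6
print(time_avg, space_avg)                          # nearly equal
```

The punchline for a general audience: you can estimate a $\pi$-average of any quantity just by watching one long run of the chain, without ever solving for $\pi$, which is exactly the idea behind Markov chain Monte Carlo.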

Thank you.