Two-state Markov Chains


Suppose I have a two-state Markov chain $V(t)$ with transition probabilities:

$P_{00}(t)=(1-\pi) + \pi e^{-\tau t}$

$P_{01}(t)= \pi - \pi e^{-\tau t}$

$P_{10}(t)=(1-\pi) - (1-\pi)e^{-\tau t}$

$P_{11}(t)= \pi + (1-\pi)e^{-\tau t}$

The initial distribution is $(1-\pi,\pi)$; that is, $\Pr(V(0)=0)=1-\pi$ and $\Pr(V(0)=1)=\pi$.

I'm trying to figure out how to show that $\Pr(V(t)=1)=\pi$ for all $t>0$. My thinking is that I need to use the fact that $$\lim_{t\rightarrow\infty}P_{01}(t)=\lim_{t\rightarrow\infty}P_{11}(t)=\pi.$$

Am I correct in my thinking?
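As a sanity check on the claim itself (not a proof), here is a small numeric verification of $\Pr(V(t)=1)=(1-\pi)P_{01}(t)+\pi P_{11}(t)$ using the transition probabilities above. The values $\pi=0.3$ and $\tau=2.0$ are illustrative choices of mine, not from the problem:

```python
import math

# Illustrative parameter values (my own assumption, not given in the problem)
pi_, tau = 0.3, 2.0

def p01(t):
    """P_01(t) = pi - pi * e^{-tau t}"""
    return pi_ - pi_ * math.exp(-tau * t)

def p11(t):
    """P_11(t) = pi + (1 - pi) * e^{-tau t}"""
    return pi_ + (1 - pi_) * math.exp(-tau * t)

# Law of total probability with the initial distribution (1-pi, pi):
# Pr(V(t)=1) = Pr(V(0)=0) * P_01(t) + Pr(V(0)=1) * P_11(t)
for t in [0.1, 0.5, 1.0, 5.0]:
    prob_one = (1 - pi_) * p01(t) + pi_ * p11(t)
    print(f"t = {t}: Pr(V(t)=1) = {prob_one}")
```

The exponential terms cancel exactly, so the printed probability equals $\pi$ at every $t$, which is what made me suspect the limit argument alone isn't the whole story.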