Let the variables $(X_t)$ form a Markov chain in which each variable takes values in $\{1,2\}$, and suppose we are given that $p(X_1=1)=p(X_1=2)=0.5$ and that the transition matrix $P$ is:
$$ P = \begin{pmatrix} 0.3 & 0.7\\ 0.6 & 0.4 \end{pmatrix} $$
1. Calculate $p(X_3=2)$. I'm not sure the answer is obvious here; would I be correct to use Bayes' theorem?
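To sanity-check whatever I derive for this part, I wrote a small NumPy sketch (the variable names are my own); it just pushes the initial distribution through the chain twice:

```python
import numpy as np

# Transition matrix from the problem: P[i, j] = p(X_{t+1} = j+1 | X_t = i+1)
P = np.array([[0.3, 0.7],
              [0.6, 0.4]])

# Uniform initial distribution over {1, 2}
p1 = np.array([0.5, 0.5])

# Marginal of X_3: multiply the row vector by P twice
p3 = p1 @ P @ P
print(p3[1])  # p(X_3 = 2), approximately 0.535
```
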
2. Calculate $p(X_2=1 \mid X_3=2)$. I got $0.6$.
3. Calculate $p(X_1=1 \mid X_3=2)$. I computed $p(X_3=2 \mid X_2=1)\,p(X_2=1 \mid X_1=1) + p(X_3=2 \mid X_2=2)\,p(X_2=2 \mid X_1=1) = (0.7)(0.3) + (0.4)(0.7) = 0.49$.
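For parts 2 and 3 I tried checking my work numerically with Bayes' theorem (again a rough sketch with my own names; I'm not certain this is the intended approach):

```python
import numpy as np

P = np.array([[0.3, 0.7],
              [0.6, 0.4]])
p1 = np.array([0.5, 0.5])

p2 = p1 @ P        # marginal of X_2
p3 = p2 @ P        # marginal of X_3

# Part 2: p(X_2=1 | X_3=2) = p(X_3=2 | X_2=1) p(X_2=1) / p(X_3=2)
post2 = P[0, 1] * p2[0] / p3[1]

# Part 3: the sum I wrote above gives p(X_3=2 | X_1=1);
# Bayes then converts it to p(X_1=1 | X_3=2)
lik = P[0, 0] * P[0, 1] + P[0, 1] * P[1, 1]   # (0.3)(0.7) + (0.7)(0.4) = 0.49
post3 = lik * p1[0] / p3[1]
print(post2, post3)
```
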
4. Approximate $p(X_{1000000}=2 \mid X_1=1)$.
I know that, under certain conditions, a Markov chain converges to a limiting distribution; how can that help me here?
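My rough attempt at exploring that limit numerically (assuming the limit exists; the names and the eigenvector trick are my own, not something given in the problem):

```python
import numpy as np

P = np.array([[0.3, 0.7],
              [0.6, 0.4]])

# Brute force: raise P to a large power; if the chain is ergodic,
# every row should converge to the same limiting distribution
Pn = np.linalg.matrix_power(P, 50)
print(Pn[0])   # approximate limit of p(X_n | X_1 = 1)

# Cross-check: the stationary distribution solves pi = pi P with sum(pi) = 1,
# i.e. it is the left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()
print(pi)      # should match the rows of Pn
```

Both computations agree, which suggests $p(X_{1000000}=2 \mid X_1=1)$ is essentially the stationary probability of state 2.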
Would really appreciate your help.