A Markov Chain $(X_n)_n$ has the following transition matrix:
$$P = \begin{bmatrix} 0.1 & 0.3 & 0.6\\ 0 & 0.4 & 0.6\\ 0.3 & 0.2 & 0.5 \end{bmatrix}$$ with initial distribution $\alpha = (0.2, 0.3, 0.5)$. Find $P(X_0 = 3 \mid X_1 = 1)$.
My solution:
$P(X_0 = 3|X_1 = 1) = \frac{P(X_0 = 3, X_1 = 1)}{P(X_1 = 1)}$
$\Rightarrow P(X_0 = 3|X_1 = 1) = \frac{P(X_1 = 1, X_0 = 3)}{P(X_1 = 1)}$
$\Rightarrow P(X_0 = 3|X_1 = 1) = \frac{P(X_1 = 1| X_0 = 3)\cdot P(X_0=3)}{P(X_1 = 1)}$
Now,
- from $P$, we have $P(X_1=1|X_0=3) = 0.30$
- from $\alpha$, we have $P(X_0=3) = 0.50$, and
- from $\alpha P$, we have $P(X_1=1) = 0.17$
So,
$P(X_0 = 3|X_1 = 1) = \frac{0.30 \cdot 0.50}{0.17} = \frac{0.15}{0.17} \approx 0.88$.
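The computation above is easy to check numerically. A quick sketch (my own addition; numpy, with state $i$ stored at index $i-1$):

```python
import numpy as np

# Transition matrix and initial distribution from the problem statement.
P = np.array([[0.1, 0.3, 0.6],
              [0.0, 0.4, 0.6],
              [0.3, 0.2, 0.5]])
alpha = np.array([0.2, 0.3, 0.5])

p_x1 = alpha @ P                          # distribution of X_1, i.e. alpha P
p_x1_is_1 = p_x1[0]                       # P(X_1 = 1) = 0.17
answer = P[2, 0] * alpha[2] / p_x1_is_1   # Bayes: P(X_0 = 3 | X_1 = 1)
print(round(p_x1_is_1, 2), round(answer, 4))  # 0.17 0.8824
```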
I have two questions regarding this:
1. Although I wrote "from $P$, we have $P(X_1=1|X_0=3) = 0.30$", I have a simple confusion here. At time $0$, the system's state is distributed according to $\alpha$. So, theoretically, shouldn't it change its state according to $\alpha$ rather than $P$? This is practically impossible, since $\alpha$ is a $1 \times 3$ row vector and hence there is no entry $\alpha_{3,1}$. But the question remains.
2. Is there a better or easier method than the one I used here?
No, what you did is correct. Given that the chain is at $X_0 = 3$, you are interested in the transition probability to state $1$. The vector $\alpha$, with $\alpha(i) = \mathbf{P}(X_0 = i)$, does not specify how the chain changes states; it only specifies where the chain spontaneously starts.
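This distinction is easy to see in a simulation (my own illustration, using numpy with zero-based state indices): $\alpha$ is used exactly once, to draw $X_0$; every subsequent step uses a row of $P$. Conditioning the sample on $X_1 = 1$ should recover the $\approx 0.88$ computed above.

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.1, 0.3, 0.6],
              [0.0, 0.4, 0.6],
              [0.3, 0.2, 0.5]])
alpha = np.array([0.2, 0.3, 0.5])

n = 100_000
x0 = rng.choice(3, size=n, p=alpha)                  # X_0 drawn from alpha
x1 = np.array([rng.choice(3, p=P[s]) for s in x0])   # one transition via row P[s]

mask = (x1 == 0)                        # condition on X_1 = 1 (index 0)
estimate = np.mean(x0[mask] == 2)       # fraction of those with X_0 = 3
print(estimate)
```

The estimate should be close to $15/17 \approx 0.882$.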
Your use of Bayes' theorem is the fastest way I can imagine for that particular problem.