I'm taking a course on probability theory where we learned about Markov chains. I think I grasped most of it, but there's one thing I don't really understand yet, which I encountered in an exercise. The exercise goes as follows:
Assume a Markov Process with the following transition matrix:
$$Q = \begin{pmatrix} 0.3 &0.3&0.4\\ 0.2 &0.2&0.6\\ 0.1 &0.1&0.8 \end{pmatrix} $$
where each row represents a possible state.
The exercise asked us various things about the limiting distribution of this process, and I understand how all of that works, but then it asked the following:
What are the odds of being in the third state in the limit and where the previous state was the first state
The problem is that I found two different solutions, but I don't know which one is correct and why the other one is incorrect.
My first train of thought was: "What are the odds that I'm in the first state in the limit and go to the third?", which turned out to be $\frac{1}{7}\cdot 0.4 \approx 0.06$. But then it occurred to me that you could also look at it as being in the third state in the limit times the probability that it came from state one, which gives me $\frac{5}{7}\cdot\frac{0.4}{0.4+0.6+0.8} \approx 0.16$.
I'm honestly more tempted to go with my second answer, but I have no idea why the first one wouldn't be correct. Could somebody perhaps clarify this?
The wording of the problem that was given to you to solve is a bit awkward, but your first calculation is wrong: you haven’t computed a conditional probability, which is what I suspect is really being asked for. By the same token, your second attempt doesn’t look quite right, either, for much the same reason. Note, though, that if there is a limiting distribution, then the previous state is irrelevant in the long run—all of the rows of $\lim_{n\to\infty}Q^n$ are equal.
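To make the distinction between the joint and the conditional probability concrete, here is a quick numerical sketch (assuming NumPy is available). It approximates the stationary distribution by taking a large power of $Q$, then computes the limiting joint probability $\pi_1 q_{13}$ and, via Bayes' rule, the conditional probability $\Pr(\text{previous}=1\mid\text{current}=3)=\pi_1 q_{13}/\pi_3$:

```python
import numpy as np

# Transition matrix from the exercise; row i gives the distribution
# of the next state given that the current state is i.
Q = np.array([[0.3, 0.3, 0.4],
              [0.2, 0.2, 0.6],
              [0.1, 0.1, 0.8]])

# All rows of Q^n converge to the stationary distribution pi,
# so any one row of a large power is a good approximation.
pi = np.linalg.matrix_power(Q, 100)[0]
print(pi)  # approximately [1/7, 1/7, 5/7]

# Joint probability: in the limit, being in state 1 AND then moving to state 3.
joint = pi[0] * Q[0, 2]

# Conditional probability: given that the chain is now in state 3,
# the previous state was state 1 (Bayes' rule with pi as the prior).
cond = pi[0] * Q[0, 2] / pi[2]

print(joint, cond)
```

Note that the conditional probability comes out to $\frac{(1/7)\cdot 0.4}{5/7} = 0.08$, which differs from both of the attempts in the question: the first computed the joint probability, and the second normalized by a column sum of $Q$ rather than by $\pi_3$.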