Given an invariant distribution is the (finite state) Markov transition matrix unique?


Doeblin's theorem states that a transition probability matrix satisfying Doeblin's condition (for instance, some power of the matrix having all entries positive) has a unique invariant distribution.

Is the converse true as well? Can two (finite state, discrete) Markov Chains have the same invariant distribution, but have different transition matrices?

I don't think so, but I have only tried a proof by contradiction:

Assume there are two transition matrices $P$ and $Q$ such that $\pi P = \pi$ and $\pi Q = \pi$, where $\pi$ is the invariant distribution. Then $\pi P = \pi Q$, so $P = Q$ (contradiction).

However, this doesn't really tell me why this should be true.

Best Answer

Unfortunately your proof contains an error: $\pi P = \pi Q$ does not imply $P = Q$. Consider for example $$ P = \begin{pmatrix} 1/2 & 1/2 & 0 \\ 1/2 & 0 & 1/2 \\ 0 & 1/2 & 1/2 \end{pmatrix}$$ and $$ Q = \begin{pmatrix} 1/3 & 1/3 & 1/3 \\ 1/3 & 1/3 & 1/3 \\ 1/3 & 1/3 & 1/3 \end{pmatrix}.$$ Both are doubly stochastic, so both have the stationary distribution $\pi = \begin{pmatrix} 1/3 & 1/3 & 1/3 \end{pmatrix}$, yet they are different matrices.
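A quick numerical check (a sketch using NumPy) confirms that both matrices fix the uniform distribution while being distinct:

```python
import numpy as np

# The two transition matrices from the counterexample (rows sum to 1)
P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])
Q = np.full((3, 3), 1/3)  # every transition equally likely

pi = np.full(3, 1/3)  # candidate stationary distribution

# pi is stationary for both chains, yet P and Q differ
print(np.allclose(pi @ P, pi))  # True
print(np.allclose(pi @ Q, pi))  # True
print(np.allclose(P, Q))        # False
```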

If you consider the elements of the system algebraically, you can see that it has many degrees of freedom: for an $n$-state chain, $\pi P = \pi$ imposes only $n$ linear constraints (one of them redundant), while $P$ has $n(n-1)$ free entries once the row sums are accounted for. You might find the answer to this related question helpful.
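The degrees of freedom can be exhibited directly: if $\pi P = \pi$ and $\pi Q = \pi$, then for any $t \in [0,1]$ the convex combination $tP + (1-t)Q$ is again a stochastic matrix and also fixes $\pi$, since $\pi(tP + (1-t)Q) = t\,\pi P + (1-t)\,\pi Q = \pi$. A short sketch (assuming NumPy, reusing the matrices from the answer) verifies this for a few values of $t$:

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])
Q = np.full((3, 3), 1/3)
pi = np.full(3, 1/3)

# Every convex combination t*P + (1-t)*Q is a stochastic matrix
# with the same stationary distribution pi, so there is a whole
# continuum of chains sharing one invariant distribution.
for t in np.linspace(0.0, 1.0, 5):
    R = t * P + (1 - t) * Q
    assert np.allclose(R.sum(axis=1), 1.0)  # rows still sum to 1
    assert np.allclose(pi @ R, pi)          # pi is still stationary
```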