Existence of limiting distribution of product of Markov chains


I have two Markov chains described by the stochastic matrices $P_1$ and $P_2$, each of which has a limiting distribution. Now I combine the two chains on the Cartesian product of their state spaces; the transition matrix of the combined chain is the Kronecker product $P = P_1 \otimes P_2$.

How does one prove that $P$ also has a limiting distribution?
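As a quick numerical sanity check (with two hypothetical 2-state chains, not taken from the question), one can verify that the Kronecker product of two row-stochastic matrices is again row-stochastic, and watch the powers of $P$ converge:

```python
import numpy as np

# Hypothetical 2-state chains; any row-stochastic matrices with a
# limiting distribution would do.
P1 = np.array([[0.9, 0.1],
               [0.2, 0.8]])
P2 = np.array([[0.5, 0.5],
               [0.3, 0.7]])

# Product chain on the 4-state product space via the Kronecker product.
P = np.kron(P1, P2)

# P is again row-stochastic: each row of P is the Kronecker product of a
# row of P1 and a row of P2, and the row sums multiply (1 * 1 = 1).
print(np.allclose(P.sum(axis=1), 1.0))   # True

# Empirically, powers of P converge: every row of P^n tends to the same
# vector, i.e. the chain forgets its starting state.
Pn = np.linalg.matrix_power(P, 100)
print(np.allclose(Pn, Pn[0]))            # True
```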


Note that $P = P_1 \otimes P_2$ is the transition matrix of the process $(X^1, X^2)$, where $X^1$ is a Markov chain with transition matrix $P_1$, $X^2$ is a Markov chain with transition matrix $P_2$, and $X^1$ and $X^2$ are independent. Thus, if $\pi_1$ is stationary for $P_1$ and $\pi_2$ is stationary for $P_2$, then $\pi_1 \otimes \pi_2$ is stationary for $P$. Recall that $(\pi_1 \otimes \pi_2)(x_1, x_2) = \pi_1(x_1)\,\pi_2(x_2)$ for every $(x_1, x_2)$.

For the limiting distribution itself, use the mixed-product property of the Kronecker product: $P^n = (P_1 \otimes P_2)^n = P_1^n \otimes P_2^n$. Hence if $P_1^n \to \Pi_1$ and $P_2^n \to \Pi_2$, then $P^n \to \Pi_1 \otimes \Pi_2$, so the limiting distribution of $P$ exists and equals $\pi_1 \otimes \pi_2$.
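The stationarity claim can be checked numerically. Below is a minimal sketch, again with hypothetical example matrices: it computes each chain's stationary distribution as the left eigenvector of the transition matrix for eigenvalue $1$, forms $\pi_1 \otimes \pi_2$, and verifies it is both stationary and limiting for $P = P_1 \otimes P_2$:

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a row-stochastic matrix: the left
    eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return v / v.sum()

# Hypothetical chains (assumptions, not from the question).
P1 = np.array([[0.6, 0.4],
               [0.3, 0.7]])
P2 = np.array([[0.8, 0.2],
               [0.5, 0.5]])

pi1, pi2 = stationary(P1), stationary(P2)
pi = np.kron(pi1, pi2)           # (pi1 ⊗ pi2)(x1, x2) = pi1(x1) pi2(x2)
P = np.kron(P1, P2)

# pi1 ⊗ pi2 is stationary for P1 ⊗ P2, since
# (pi1 ⊗ pi2)(P1 ⊗ P2) = (pi1 P1) ⊗ (pi2 P2) = pi1 ⊗ pi2.
print(np.allclose(pi @ P, pi))   # True

# And it is the limiting distribution: every row of P^n tends to pi.
print(np.allclose(np.linalg.matrix_power(P, 100), pi))  # True
```

The two assertions printed at the end mirror the two steps of the argument: stationarity via the mixed-product property, and convergence of $P^n = P_1^n \otimes P_2^n$ row-wise to $\pi_1 \otimes \pi_2$.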