Probability of returning to the initial state



Let $P=\begin{bmatrix}0&\frac{1}{2}&\frac{1}{2}\\\frac{1}{2}&0&\frac{1}{2}\\\frac{1}{2}&\frac{1}{2}&0\end{bmatrix}$ and $P^{(n+1)}=P^{(n)}P.$

I know that, by symmetry, if you start at any vertex the probability of return after $n$ hops is the same: $p_{11}^{(n)}=p_{22}^{(n)}=p_{33}^{(n)}$. Then $$p_{11}^{(n+1)}=\frac{1}{2}p_{12}^{(n)}+\frac{1}{2}p_{13}^{(n)},$$ and since $p_{11}^{(n)}+p_{12}^{(n)}+p_{13}^{(n)}=1$, $$p_{11}^{(n+1)}=\frac{1}{2}\left(1-p_{11}^{(n)}\right),$$ but I do not know how to solve this recurrence relation, or whether it is right.


BEST ANSWER

Some related summary comments on ergodic Markov chains.

(There are several fleas hopping around triangles elsewhere on this site, each according to the same transition matrix, but with varying questions about the associated process; search 'Markov flea vertex'.)

With help from @Did, essentially by manipulating difference equations, I believe you have understood that $$\lim_{n \to \infty} p_{11}^{(n)} = \lim_{n \to \infty} [1/3 + (-1/2)^n(2/3)] = 1/3.$$ The purpose of this comment is to put that accomplishment in context.
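For completeness, the difference-equation manipulation can be written out explicitly. Starting from the recurrence $p_{11}^{(n+1)}=\frac12\left(1-p_{11}^{(n)}\right)$ with $p_{11}^{(0)}=1$, subtract the fixed point and iterate:

```latex
\begin{align*}
p_{11}^{(n+1)} &= \tfrac{1}{2}\left(1 - p_{11}^{(n)}\right)
  && \text{fixed point: } p^* = \tfrac{1}{2}(1 - p^*) \implies p^* = \tfrac{1}{3} \\
q_n &:= p_{11}^{(n)} - \tfrac{1}{3}
  && \text{so that } q_{n+1} = -\tfrac{1}{2}\, q_n \\
q_n &= \left(-\tfrac{1}{2}\right)^{n} q_0
  && \text{with } q_0 = 1 - \tfrac{1}{3} = \tfrac{2}{3} \\
p_{11}^{(n)} &= \tfrac{1}{3} + \left(-\tfrac{1}{2}\right)^{n}\tfrac{2}{3}.
\end{align*}
```

The sign $(-\tfrac12)^n$ explains why the return probability oscillates above and below $1/3$ as it converges.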

Indeed, $\lim_{n \to \infty} p_{ji}^{(n)} = 1/3,$ for all $i, j.\;$ Or, in matrix notation, $\text{P}^n$ converges to a $3 \times 3$ matrix in which all 9 entries are $1/3.$ Over the long run, the starting vertex is unimportant, and the chain spends 1/3 of its 'time' at each vertex.

This is an 'ergodic' Markov chain. That is, there is a power $N$ such that $\text{P}^N$ has all positive entries. (In this case $N = 2$.) Such an ergodic chain has a unique stationary distribution $\sigma$ (a 3-vector) such that $\sigma\text{P} = \sigma$, which is the same as the limiting distribution (a 3-vector) with elements $\pi_i = \lim_n p_{ji}^{(n)}.$ That is, each of the three rows of the matrix $\lim_n \text{P}^n$ is the 3-vector $\sigma = \pi.$

The particular transition matrix $\text{P}$ of the Markov chain of the current problem is 'doubly stochastic'; its columns (as well as its rows) sum to 1. A Markov chain with such a $k \times k$ transition matrix has a stationary $k$-vector $\sigma$ with all elements equal. If such a chain is ergodic then the limiting matrix $\lim_n \text{P}^n$ has all $k^2$ elements equal.
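As a quick numerical check of those two claims (sketched here in Python with NumPy rather than R), the uniform 3-vector is stationary for this doubly stochastic $\text{P}$, and high powers of $\text{P}$ approach the rank-one matrix with all entries $1/3$:

```python
import numpy as np

# Transition matrix of the triangle chain (doubly stochastic).
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

sigma = np.full(3, 1/3)                 # uniform stationary 3-vector
assert np.allclose(sigma @ P, sigma)    # sigma P = sigma

# A high matrix power is essentially the limiting matrix: all entries 1/3.
limit = np.linalg.matrix_power(P, 50)
assert np.allclose(limit, 1/3)
```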

However, a chain with a doubly-stochastic matrix need not be ergodic. For example, if the flea hops clockwise or counter-clockwise (equal probability) to adjacent vertices of a SQUARE, then the process is periodic with period 2. There is no power of the transition matrix with all positive elements.
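The square example can also be checked numerically (again a Python sketch): every power of the 4-cycle's transition matrix keeps some zero entries, because the flea alternates between the two 'parity classes' of vertices.

```python
import numpy as np

# Flea on a SQUARE: hop to either adjacent vertex with probability 1/2.
# Vertices 0..3 in cyclic order; this matrix is doubly stochastic but periodic.
Q = np.array([[0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0]])

# No power of Q is strictly positive: the chain has period 2, so Q^n
# alternates between 'even-step' and 'odd-step' support patterns.
for n in range(1, 21):
    Qn = np.linalg.matrix_power(Q, n)
    assert (Qn == 0).any()   # some entry is exactly zero at every power
```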

Some computations using R (other code or software may be more elegant):

 p = 1/2;  P = matrix(c(0,p,p,  p,0,p,  p,p,0), byrow=T, nrow=3);  P
 ##     [,1] [,2] [,3]
 ##[1,]  0.0  0.5  0.5
 ##[2,]  0.5  0.0  0.5
 ##[3,]  0.5  0.5  0.0

 P2 = P %*% P;  P2                    # 2nd power all positive elements
 ##     [,1] [,2] [,3]
 ##[1,] 0.50 0.25 0.25
 ##[2,] 0.25 0.50 0.25
 ##[3,] 0.25 0.25 0.50

 P4 = P2 %*% P2; P8 = P4 %*% P4;  P8  # 8th power already close to limit
 ##          [,1]      [,2]      [,3]
 ##[1,] 0.3359375 0.3320313 0.3320313
 ##[2,] 0.3320313 0.3359375 0.3320313
 ##[3,] 0.3320313 0.3320313 0.3359375
 1/3 + (-1/2)^8 * 2/3                 # matches diagonal elements
 ## 0.3359375


 # Normed first eigenvector of t(P) gives the stationary distribution
 # (as.real is defunct in current R; use Re to take the real part)
 eig.vec = Re(eigen(t(P))$vectors[,1]);  eig.vec/sum(eig.vec)
 ## 0.3333333 0.3333333 0.3333333

The transition probability matrix in this case is $$ P=\begin{bmatrix}0 & 0.5 & 0.5\\ 0.5 & 0 & 0.5\\ 0.5 & 0.5 & 0\end{bmatrix}. $$ Though $p_{ii}=0$ for $i=1,2,3$, it can be observed that $p_{ii}^{(n)}\ne0$ for $n\ge 2$ (also intuitive: the flea can leave a vertex and come back). The rest is a straightforward calculation depending on $n$.
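That calculation can be verified directly (a Python sketch): the diagonal entries of $P^n$ match the closed form $1/3 + (-1/2)^n \cdot 2/3$ from the other answer.

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

for n in range(10):
    Pn = np.linalg.matrix_power(P, n)
    closed_form = 1/3 + (-0.5)**n * (2/3)
    # all three diagonal return probabilities agree with the closed form
    assert np.isclose(Pn[0, 0], closed_form)
    assert np.isclose(Pn[1, 1], closed_form)
    assert np.isclose(Pn[2, 2], closed_form)

print(np.linalg.matrix_power(P, 8)[0, 0])   # 0.3359375, as in the R output
```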