Check whether a Markov chain on the naturals is recurrent


I have a Markov chain defined on $N = \{1,2,\dots\}$ whose transition probabilities satisfy $$p(n,n+1) = \frac{n}{n+2}\cdot p(n+1,n), \quad \forall n \in N.$$ Also $p(m,n) = 0$ for $|m-n|>1$, and $p(n,n)=0$ for $n>1$.

Check whether the chain is recurrent.

My thoughts: This chain is irreducible, since we can always go from $x$ to $y$ in $|y-x|$ one-length steps, each of which has positive probability. However, the state space is not finite (it is countably infinite), so I cannot immediately deduce positive recurrence.
It would suffice to find a single state that is recurrent, or a single state that is not, since recurrence is a class property.

I could set up a system to solve for $E[T_x^+ \mid X_0 = y]$, but since we do not know $p(n,n+1)$ explicitly, I am not sure I can solve it. I can see that
$p(n,n+1) + p(n,n-1) = 1$ for $n \ge 2$, and
$p(1,1) + p(1,2) = 1$, but I don't think these are enough to solve for $p(n,n+1)$.


BEST ANSWER

A slight aside: it takes some work to show that the one-length steps all have positive probability in this Markov chain. A priori, it seems possible that for some $n$, $p(n,n+1) = p(n+1,n) = 0$. However, in that case $p(n+1,n+2) = 1$, because that is the only other possible transition from $n+1$, and then $p(n+1,n+2) = \frac{n+1}{n+3} \cdot p(n+2,n+1)$ forces $p(n+2,n+1) = \frac{n+3}{n+1} > 1$. So we can eliminate this possibility.


On to the real problem. One way to figure out the answer: if the Markov chain has a stationary distribution (in addition to being irreducible), then it's positive recurrent.

In this problem, we can find a stationary distribution by solving the time-reversibility equations (also known as detailed balance): $\pi_n p(n,m) = \pi_m p(m,n)$ for all $m,n$. In this case, that just tells us $$ \pi_n \cdot p(n,n+1) = \pi_{n+1} \cdot p(n+1,n) \implies \pi_n \cdot \frac{n}{n+2} \cdot p(n+1,n) = \pi_{n+1} \cdot p(n+1,n), $$ which implies $\pi_{n+1} = \frac{n}{n+2} \cdot \pi_n$. We can rewrite this as $\pi_n \cdot n(n+1) = \pi_{n+1} \cdot (n+1)(n+2)$, so there must be some constant $C$ such that $\pi_n = \frac{C}{n(n+1)}$ for all $n$. The condition $\sum_{n=1}^\infty \pi_n = 1$ lets us solve for $C$: the sum $\sum_{n=1}^\infty \frac{1}{n(n+1)}$ telescopes to $1$, so $C = 1$.
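As a quick sanity check of this stationary distribution (a sketch using exact rational arithmetic; the names `pi` and `partial` are just illustrative):

```python
from fractions import Fraction

# Candidate stationary distribution from detailed balance:
# pi_n = C / (n(n+1)), with C = 1 because the partial sums telescope:
#   1/(n(n+1)) = 1/n - 1/(n+1)  =>  sum_{n=1}^N = 1 - 1/(N+1) -> 1.
def pi(n):
    return Fraction(1, n * (n + 1))

# Partial sums telescope exactly to 1 - 1/(N+1).
N = 1000
partial = sum(pi(n) for n in range(1, N + 1))
assert partial == 1 - Fraction(1, N + 1)

# The detailed-balance recursion pi_{n+1} = n/(n+2) * pi_n holds exactly.
assert all(pi(n + 1) == Fraction(n, n + 2) * pi(n) for n in range(1, 100))
```

Working with `Fraction` keeps every identity exact, so the telescoping and the recursion can be checked by equality rather than by floating-point tolerance.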

(In general, the time-reversibility equations are not guaranteed to have a solution - but when they do, they always give us a stationary distribution.)


Finally, we can solve for the explicit transition probabilities, but it takes more work.

For $n \ge 1$, let $a_n = p(n,n+1)$. Because $p(n+1,n) = 1 - p(n+1,n+2)$ for all $n \ge 1$, the identity we are given states that $a_n = \frac{n}{n+2}(1 - a_{n+1})$, or $a_{n+1} = 1 - \frac{n+2}{n} a_n$. Mathematica tells me a horrible explicit formula for $a_n$ based on this recurrence: $$ a_n = n - 2n(n+1) \Phi (-1,1,n+2) - (-1)^n \frac{n(n+1)}{2} (a_1-3+4\ln2) $$ The $\Phi(-1,1,n+2)$ is the Lerch transcendent $\sum_{k=0}^\infty \frac{(-1)^k}{(k+n+2)}$, which doesn't have a closed form - but Mathematica can tell us that in the limit as $n \to \infty$, $ n - 2n(n+1) \Phi (-1,1,n+2)$ converges to $\frac12$.

As a result, if $a_1 - 3 + 4 \ln 2$ is not $0$, $a_n$ will oscillate wildly as $n \to \infty$ and eventually leave the interval $[0,1]$. Therefore we can conclude that, for this Markov chain to be well-defined, we must have $a_1 = p(1,2) = 3 - 4 \ln 2 \approx 0.227411$.
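The instability can be checked numerically by iterating the recurrence $a_{n+1} = 1 - \frac{n+2}{n} a_n$ directly (a floating-point sketch; the homogeneous error is amplified by a factor of roughly $\frac{n(n+1)}{2}$, so starting from $a_1 = 3 - 4\ln 2$ the iterates stay in $[0,1]$ for a long stretch, while any sizeable perturbation of $a_1$ is quickly blown out of $[0,1]$):

```python
import math

def iterate(a1, steps):
    """Return [a_1, ..., a_steps] from the recurrence a_{n+1} = 1 - (n+2)/n * a_n."""
    a = [a1]
    for n in range(1, steps):
        a.append(1 - (n + 2) / n * a[-1])
    return a

# With the special value a_1 = 3 - 4 ln 2, the iterates remain valid
# probabilities and approach 1/2, the fixed point of L = 1 - L.
good = iterate(3 - 4 * math.log(2), 200)
assert all(0 <= x <= 1 for x in good)
assert abs(good[-1] - 0.5) < 0.01

# Perturbing a_1 by only 1e-3 makes the sequence leave [0, 1]:
# the deviation grows like n(n+1)/2 * 1e-3.
bad = iterate(3 - 4 * math.log(2) + 1e-3, 60)
assert any(x < 0 or x > 1 for x in bad)
```

Note that double precision eventually loses this fight too: the initial rounding error of about $10^{-16}$ is itself amplified by $\frac{n(n+1)}{2}$, so the iterates only track the true $a_n$ for moderately large $n$.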