Markov chains. If $X_0=0$ then the probability that $X_n\ge1$ for all $n\ge 1$ is $\frac{6}{\pi^2}$


Let $(X_n)_{n\ge0}$ be a Markov chain on $\{0,1,\dots\}$ with transition probabilities given by

$$p_{0,1}=1,\qquad p_{i,i+1}+p_{i,i-1}=1,\qquad p_{i,i+1}=\Big(\frac{i+1}{i}\Big)^2p_{i,i-1},\quad i\ge1.$$

I need to show that if $X_0=0$, then the probability that $X_n\ge1$ for all $n\ge 1$ is $\frac{6}{\pi^2}$.

I'm looking for tips on how to solve this task.

Best answer

Here is a rather long answer, but it has the merit of being quite general.

Minimality of the hitting probability

In the general case of a Markov chain with state space $\mathbb{N}$ and transition probabilities $(p_{i,j})_{(i,j)\in\mathbb{N}^2}$, fix a subset of states $\mathcal{P} \subset \mathbb{N}$, and for each $i \in\mathbb{N}$ let $(X_n(i))$ be a Markov chain starting from $i$ and let $h_{\mathcal{P}}(i) = \mathbb{P}\big(\inf\{n\ge 0 : X_n(i) \in \mathcal{P}\} < +\infty\big)$ be the corresponding hitting probability. Then $$h_{\mathcal{P}} \mbox{ is the minimal nonnegative solution of }\quad h(i)=\left\{\begin{array}{ll} 1 & \mbox{ if } i \in \mathcal{P},\\ \sum \limits_{j \in \mathbb{N}} p_{i,j}h(j) & \mbox{ otherwise.}\end{array}\right.$$

Indeed, $h_{\mathcal{P}}$ is a solution. Conversely, if $h$ is a solution, then for $i \in \mathbb{N} \setminus \mathcal{P}$, $h(i) = \sum \limits_{j \in \mathcal{P}} p_{i,j} + \sum \limits_{j \in \mathbb{N}\setminus \mathcal{P}} p_{i,j}h(j)$. Substituting the equation into itself, we also get $h(i) = \sum \limits_{j \in \mathcal{P}} p_{i,j} + \sum \limits_{j \in \mathbb{N}\setminus\mathcal{P}}p_{i,j}\sum \limits_{k \in \mathcal{P}}p_{j,k} + \sum \limits_{j \in \mathbb{N}\setminus\mathcal{P}}p_{i,j} \sum \limits_{k \in \mathbb{N}\setminus \mathcal{P}} p_{j,k}h(k)$. Iterating $n$ times and noting that the last term is nonnegative, you get (writing $X_m$ for the chain started at $i$) $$h(i) \ge \mathbb{P}(X_1 \in \mathcal{P})+\mathbb{P}(X_1 \notin \mathcal{P},X_2\in\mathcal{P})+\cdots+\mathbb{P}(X_1\notin\mathcal{P},\dots,X_{n-1}\notin\mathcal{P},X_n\in\mathcal{P}).$$ Letting $n \to \infty$, the right-hand side converges to the probability of ever hitting $\mathcal{P}$, so $h(i) \ge h_{\mathcal{P}}(i)$ by the definition of $h_{\mathcal{P}}$.
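This characterization can also be checked numerically on the chain from the question, with $0$ made absorbing. The sketch below rests on two assumptions not in the argument above: the state space is truncated at an arbitrary cutoff `N` (with the cutoff state absorbing but outside $\mathcal{P}$, which slightly underestimates the hitting probability), and we run the monotone iteration $h^{(0)}=\mathbf{1}_{\mathcal{P}}$, $h^{(t+1)}(i)=\sum_j p_{i,j}h^{(t)}(j)$, which increases to the minimal nonnegative solution.

```python
import numpy as np

# Truncated state space {0, ..., N-1}; states 0 and N-1 are absorbing.
N = 400
i = np.arange(1, N - 1)
p_down = i**2 / (i**2 + (i + 1)**2)      # p_{i,i-1}
p_up = (i + 1)**2 / (i**2 + (i + 1)**2)  # p_{i,i+1}

# Monotone iteration starting from the indicator of P = {0}.
h = np.zeros(N)
h[0] = 1.0
for _ in range(100_000):
    # RHS is evaluated before assignment, so this is a synchronous update;
    # h[0] stays 1 (in P) and h[-1] stays 0 (escape past the cutoff).
    h[1:-1] = p_down * h[:-2] + p_up * h[2:]

print(h[1])  # close to 1 - 6/pi^2, the value derived below
```

Increasing `N` and the number of iterations tightens the approximation, since the truncation only discards walks that wander past the cutoff before returning.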


Back to your problem

I will modify your problem slightly. We are going to see $0$ as a well (an absorbing state), and the random walk starts from $1$: so $X_0=1$ and $p_{0,0} = 1$, and nothing else is changed. This is legitimate because in the original chain $p_{0,1}=1$, so starting from $0$ the event $\{X_n\ge 1 \mbox{ for all } n\ge 1\}$ has the same probability as the event of never returning to $0$ when starting from $1$.

In your problem the hitting states are just $\mathcal{P} = \{0\}$, and the transitions are as you stated. Let us define and compute the sequence $u_n = h_{\mathcal{P}}(n)$, the probability of hitting $0$ starting from $n$. We have $u_0 = 1$ (if you start in the well $0$, you hit it) and for all $n \ge 1$, the law of total probability yields $$u_n = p_{n,n-1}u_{n-1}+p_{n,n+1}u_{n+1} = \frac{n^2}{n^2+(n+1)^2}u_{n-1}+\frac{(n+1)^2}{n^2+(n+1)^2}u_{n+1}.$$ Regrouping gives $(n+1)^2 (u_{n+1}-u_n) = n^2 (u_n-u_{n-1})$, so by telescoping $$u_{n+1}-u_n = \Big(\prod \limits_{k=1}^n \frac{k^2}{(k+1)^2}\Big)(u_1-u_0)=\frac{1}{(n+1)^2}(u_1-1).$$

Summing, we get $u_n-u_0 = \sum \limits_{k=0}^{n-1} \frac{1}{(k+1)^2}(u_1-1)$ i.e. $$u_n = 1 - (1-u_1) \sum \limits_{k=1}^n \frac{1}{k^2}.$$
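As a quick sanity check (purely illustrative, with an arbitrary test value for $u_1$), one can iterate the recurrence directly and compare against the closed form:

```python
# Check that u_n = 1 - (1 - u_1) * sum_{k=1}^n 1/k^2 solves the recurrence
# (n^2 + (n+1)^2) u_n = n^2 u_{n-1} + (n+1)^2 u_{n+1}, with u_0 = 1.
u1 = 0.5                    # arbitrary test value (an assumption)
u_prev, u = 1.0, u1
for n in range(1, 50):
    u_next = ((n**2 + (n + 1)**2) * u - n**2 * u_prev) / (n + 1)**2
    u_prev, u = u, u_next
    closed = 1 - (1 - u1) * sum(1 / k**2 for k in range(1, n + 2))
    assert abs(u - closed) < 1e-9
```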

Now use the first section: among all possible choices of $u_1$, we must take the one that makes $(u_n)$ the minimal nonnegative solution. Recall that $\sum \limits_{k=1}^n \frac{1}{k^2} \underset{n \to \infty}{\longrightarrow} \frac{\pi^2}{6}$. If we had $u_1 < 1-\frac{6}{\pi^2}$, then $u_n$ would eventually become negative; and with $u_1 > 1-\frac{6}{\pi^2}$ the solution would not be minimal (any $u_1'$ between $1-\frac{6}{\pi^2}$ and $u_1$ would give a smaller nonnegative solution).

That gives you $u_1 = 1-\frac{6}{\pi^2}$: the probability of going back to zero starting from $1$ is $1-\frac{6}{\pi^2}$, so the probability that $X_n \ge 1$ for all $n \ge 1$ is $1-u_1 = \frac{6}{\pi^2}$, as required. Additionally, we found that the probability of going back to zero starting from $n$ is $1 - \frac{6}{\pi^2} \sum \limits_{k=1}^n \frac{1}{k^2}$.
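The answer can be corroborated by Monte Carlo simulation. The sketch below is not part of the proof; the trial count, step cap, and seed are arbitrary choices, and a walk still alive after `max_steps` steps is counted as never returning, which slightly overestimates the survival probability.

```python
import random

random.seed(0)  # fixed seed for reproducibility

def survives(max_steps=2_000):
    """Run the walk from state 1; return True if it avoids 0 for max_steps."""
    x = 1
    for _ in range(max_steps):
        if random.random() < (x + 1)**2 / (x**2 + (x + 1)**2):
            x += 1  # step up with probability p_{x,x+1}
        else:
            x -= 1  # step down with probability p_{x,x-1}
        if x == 0:
            return False
    return True

trials = 4_000
est = sum(survives() for _ in range(trials)) / trials
print(est)  # close to 6/pi^2 ≈ 0.6079
```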