A Markov chain's return time $\tau_{ret}(i)$ is defined as the minimum number of steps for the chain to return to state $i$ after leaving $i$. Here is an interesting process. Suppose we have a random walk on the countable set of non-negative integers $\Omega = \{0, 1, 2, \ldots\}$. At each step, if the chain is in state $x_t = n$, it moves up with probability $\alpha$ (i.e. $x_{t+1} = n+1$) and returns to $x_{t+1} = 0$ with probability $1-\alpha$. So, $$x_{t+1}=\begin{cases} n+1, & \text{with probability } \alpha, \\ 0, & \text{with probability } 1-\alpha. \end{cases}$$
The questions are:
(1) How does one calculate the probability of returning to state $0$ in finitely many steps, that is,
$$\Pr(\tau_{ret}(0)<\infty) = \sum\limits_{t=1}^{\infty} \Pr(\tau_{ret}(0)=t)?$$
(2) And how does one then calculate the expected return time
$$\mathbb{E}[\tau_{ret}(0)].$$
Any suggestion or answers would be greatly appreciated!
The probability of returning in finite time is $0$ if $\alpha=1$ and $1$ otherwise: the probability of not having returned after $t$ steps is $\alpha^t$, which tends to $1$ or $0$ as $t\to\infty$ according as $\alpha=1$ or $\alpha\lt1$.
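To make this concrete: for $\alpha<1$, the return time satisfies $\Pr(\tau_{ret}(0)=t)=\alpha^{t-1}(1-\alpha)$ (the walk must move up $t-1$ times and then reset), and these probabilities sum to $1$. A quick numerical sketch of my own (the value $\alpha=0.7$ is an arbitrary choice) checks this:

```python
alpha = 0.7  # arbitrary example with alpha < 1

# Pr(tau_ret(0) = t) = alpha^(t-1) * (1 - alpha): go up t-1 times, then reset.
p = [alpha**(t - 1) * (1 - alpha) for t in range(1, 200)]

total = sum(p)           # partial sum of the geometric series, close to 1
survival = alpha**199    # probability of no return by step 199, essentially 0

print(total, survival)
```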
At every step the walk has the same probability $1-\alpha$ of returning to the origin, so $\tau_{ret}(0)$ is geometrically distributed. The expected number of trials until the first success with success probability $p$ is $\frac1p$, so the expected time to return to the origin is $\frac1{1-\alpha}$.
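As a sanity check, a short Monte Carlo simulation reproduces the value $\frac1{1-\alpha}$. This is my own sketch; the choices $\alpha=0.7$, the sample size, and the seed are all arbitrary:

```python
import random

def return_time(alpha, rng):
    """Steps until the walk, started at 0, first comes back to 0."""
    t = 0
    while True:
        t += 1
        if rng.random() < 1 - alpha:  # reset to 0 with probability 1 - alpha
            return t
        # otherwise the walk moves up by one; only the step count matters here

rng = random.Random(0)  # fixed seed for reproducibility
alpha = 0.7
n = 100_000
estimate = sum(return_time(alpha, rng) for _ in range(n)) / n

print(estimate, 1 / (1 - alpha))  # estimate should be close to 1/(1-alpha)
```

With $\alpha=0.7$ the exact answer is $1/0.3 \approx 3.33$, and the empirical mean lands within a few hundredths of it at this sample size.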