The expected number of visits before hitting zero in simple random walk


I am learning Markov chains and encountered the following problem: in a simple random walk started from state $k$, what is the expected number of visits to $k$ before we hit $0$? The book does not say whether time $0$ counts as a visit, but I assume it does.

This question has a first part: the expected number of visits to $k$ between two successive visits to $0$ is $1$ for all $k$. That part is solved by defining the stationary measure $\mu(x) =$ the expected number of visits to $x$ before returning to $0$, starting from $0$ (this is well defined because the starting state $0$ is recurrent); this is Theorem 6.5.2 in Rick Durrett's book. Theorem 6.5.3 asserts that if the chain is irreducible and recurrent, then the stationary measure is unique up to constant multiples. Since the counting measure is stationary for the simple random walk, $\mu$ must be constant. But $\mu(0) = 1$ (the initial visit at $0$ is the only one before the return), which implies $\mu(k) = 1$ for all $k \ne 0$.
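My attempt to connect the first part to the question (my own reasoning, so please check it): by the strong Markov property, an excursion from $0$ either never reaches $k$, or, from the moment it first hits $k$, accumulates visits to $k$ exactly as if the walk had started at $k$. So the first part factors as

$$1 = \mu(k) = P_0(\text{hit } k \text{ before returning to } 0) \cdot E_k[\text{visits to } k \text{ before hitting } 0],$$

and it would remain to compute the hitting probability in the first factor.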

Thank you.


Here is the start of a solution:

Let $p_i$ be the probability that when starting from state $i$, we hit state $k$ before we hit state $0$.

Clearly, $p_i = 0$ for $i \le 0$ and $p_i = 1$ for $i \ge k$.

If our current state is $i$ where $0 < i < k$, then we have a $\tfrac{1}{2}$ chance of taking a step to $i-1$ and a $\tfrac{1}{2}$ chance of taking a step to $i+1$. Therefore $p_i = \dfrac{p_{i-1}+p_{i+1}}{2}$ for $0 < i < k$.

Rearrange that equation to get $p_{i} - p_{i-1} = p_{i+1} - p_{i}$. So, $p_i$ grows linearly from $i = 0$ to $i = k$.
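As a numerical sanity check (my own sketch, not part of the original answer): the boundary-value recurrence above can be solved by simple Jacobi-style iteration, and the fixed point matches the linear profile $p_i = i/k$. The choice $k = 10$ is arbitrary.

```python
# Solve p_i = (p_{i-1} + p_{i+1}) / 2 on 0 < i < k, with boundary
# values p_0 = 0 and p_k = 1, by repeatedly applying the averaging rule.
k = 10
p = [0.0] * (k + 1)
p[k] = 1.0

for _ in range(20000):  # iterate the averaging rule to its fixed point
    p = [p[0]] + [(p[i - 1] + p[i + 1]) / 2 for i in range(1, k)] + [p[k]]

# The fixed point is the linear interpolation between the boundary values,
# i.e. p_i is (numerically) i / k.
print([round(v, 6) for v in p])
```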

Now you should be able to determine $p_i$ for $0 < i < k$. Next: starting from $k$ and taking one step, what is the probability of reaching $0$ before returning to $k$? Each visit to $k$ independently "escapes" to $0$ with that probability, so the number of visits is geometric, and from its mean you can get the answer to the problem.
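Carrying the hint through numerically (my own sketch, not part of the original answer): since the symmetric walk started above $k$ returns to $k$ with probability $1$, an up-step from $k$ can be credited as an extra visit immediately, and only the segment strictly between $0$ and $k$ needs to be simulated step by step. The Monte Carlo mean comes out close to $2k$, consistent with the visit count being geometric with escape probability $\frac{1}{2k}$.

```python
import random

def visits_before_zero(k, rng):
    """Count visits to k (including the start) before a walk from k hits 0.

    An up-step from k is counted as a guaranteed extra visit, because the
    symmetric walk above k returns to k with probability 1; only the
    gambler's-ruin segment strictly between 0 and k is simulated.
    """
    visits = 1  # the start at k counts as a visit
    while True:
        if rng.random() < 0.5:       # step up: certain to come back to k
            visits += 1
            continue
        x = k - 1                    # step down: gambler's ruin on (0, k)
        while 0 < x < k:
            x += 1 if rng.random() < 0.5 else -1
        if x == 0:                   # hit 0 before returning to k: done
            return visits
        visits += 1                  # returned to k first: one more visit

rng = random.Random(0)
k = 3
n = 20000
est = sum(visits_before_zero(k, rng) for _ in range(n)) / n
print(est)  # should be close to 2 * k = 6
```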