I have a discrete-time Markov chain on the integers: from state $j$ the chain moves to $j-1$ with probability $\mu$ and to $j+1$ with probability $\lambda$, where $\mu + \lambda = 1$.
I am asked to calculate the probability that, starting from $0$, the chain ever hits a given state $i \ge 1$.
I tried the same approach as for the gambler's ruin problem, but it does not seem to work here since there is no absorbing barrier at $0$: the walk can drift arbitrarily far to the left.
I wrote $h(j)$ for the probability of hitting $i$ starting from $j$. Then I know the following equations hold:
$h(i)=1$ and $h(j)=\mu h(j-1)+\lambda h(j+1)$ for $j < i$.
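For what it's worth, this is a linear recurrence with constant coefficients, so (assuming $\lambda \ne \mu$) its characteristic equation $\lambda x^2 - x + \mu = 0$ has roots $1$ and $\mu/\lambda$, and every solution has the form

$$h(j) = A + B\left(\frac{\mu}{\lambda}\right)^{j}, \qquad \lambda \ne \mu,$$

which makes it explicit that $h(i)=1$ alone leaves a one-parameter family of solutions, so exactly one more condition is needed to pin down $A$ and $B$.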
However, I cannot find a second condition to impose, so the system does not determine a unique solution.
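To sanity-check any candidate solution numerically, here is a quick Monte Carlo sketch (the function name and parameters are my own, not from any standard library) that estimates the probability of ever hitting a target state starting from $0$:

```python
import random

def hit_prob_estimate(lam, target, n_trials=20000, max_steps=2000, seed=0):
    """Monte Carlo estimate of P(walk started at 0 ever reaches `target`),
    where each step is +1 with probability lam and -1 with probability
    mu = 1 - lam.  Walks are truncated after max_steps, which slightly
    underestimates the probability when hits happen very late."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        pos = 0
        for _ in range(max_steps):
            pos += 1 if rng.random() < lam else -1
            if pos == target:
                hits += 1
                break
    return hits / n_trials
```

For example, `hit_prob_estimate(0.4, 1)` gives a numerical value that any candidate formula for $h(0)$ with $\lambda = 0.4$, $i = 1$ should reproduce.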