Consider a random walk on the integers that steps right with probability $\lambda$ and left with probability $\mu$ (with $\lambda+\mu=1$). Suppose we start at $0$. I want to find the probability of ever hitting a fixed state $i$. I found the following related question on MSE: Exercise 2.7.1 of J. Norris, "Markov Chains"
I tried to solve it using the hints in that post, but did not succeed. The hint was to compute $P_0(T_i<T_n)$ for $i\geq 1$ and $n\leq -1$, where $T_i$ denotes the first passage time to $i$ (and likewise for $T_n$), and then let $n\rightarrow -\infty$ to obtain the desired probability. Here is what I tried:
By the Markov property applied to the first step, $P_0(T_i<T_n)=\mu P_0(T_{i+1}<T_{n+1})+\lambda P_0(T_{i-1}<T_{n-1})$: after a step to the left the walk sits at $-1$, so by translation invariance it must now reach $i+1$ before $n+1$ from $0$, and similarly after a step to the right. I believe this can be iterated, but it seems it will become very complicated to extract a closed-form value in the end. I would really appreciate some help; I have been thinking about this for a long time.
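As a sanity check on whatever closed form the recursion eventually yields, here is a small sketch (my own, not from the exercise) that compares a Monte Carlo estimate of $P_0(T_i<T_n)$ against the standard gambler's-ruin formula on $\{n,\dots,i\}$ for $\lambda\neq\mu$, namely $P_0(T_i<T_n)=\frac{1-r^{-n}}{1-r^{i-n}}$ with $r=\mu/\lambda$. The function names `hit_prob_mc` and `hit_prob_exact` are just my labels.

```python
import random

def hit_prob_mc(lam, i, n, trials=20000, seed=1):
    """Monte Carlo estimate of P_0(T_i < T_n): run the walk from 0
    until it first hits i or n, and count how often i wins."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = 0
        while n < x < i:
            x += 1 if rng.random() < lam else -1  # +1 w.p. lam, -1 w.p. mu
        hits += (x == i)
    return hits / trials

def hit_prob_exact(lam, i, n):
    """Gambler's-ruin formula on {n, ..., i}, started at 0, for lam != 1/2:
    P_0(T_i < T_n) = (1 - r^(0-n)) / (1 - r^(i-n)),  r = mu / lam."""
    r = (1 - lam) / lam
    return (1 - r ** (-n)) / (1 - r ** (i - n))

print(hit_prob_mc(0.6, 2, -3), hit_prob_exact(0.6, 2, -3))
```

Letting $n\to-\infty$ in `hit_prob_exact` suggests the limit is $1$ when $\lambda>\mu$ and $(\lambda/\mu)^i$ when $\lambda<\mu$, which is what the hint's limiting argument should produce.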