Let $(Y_k)_{k \geq 1}$ be i.i.d. with $P(Y_k=1)=P(Y_k=-1)=\frac{1}{2}$, and set $X_n=\sum_{k=1}^{n}Y_k$ (so $X_0=0$).
$T=\inf \{ n \geq 0 : X_n = 1 \}$
I'm trying to prove that $P(T<\infty)=1$.
My attempt:
Equivalently, I want to prove that the complement has probability zero, i.e. $P(T=\infty)=0$:
$$\{ T = \infty \} = \bigcap_{n=1}^{\infty} \{ X_n \neq 1\} $$ $$ =\{ X_1 \neq 1 \} \cap \left( \bigcap_{n=2}^{\infty} \{ X_n \neq 1\} \right)$$ $$ =\{ Y_1 = -1 \} \cap \left( \bigcap_{n=2}^{\infty} \{ X_n - X_1 \neq 2\} \right)$$
but at this stage I find it complicated to keep nesting the different cases.
Any help?
The distribution of $T$ is actually rather complicated, so trying to compute it directly, when all you actually want is to show that $T$ is a.s. finite, is a waste of effort.
One way to proceed is to solve an ostensibly harder problem: given $(a,x)$ with $-\infty<a<x \leq 1$, find the probability of hitting $1$ before $a$ when the walk starts at $x$. Call this probability $q(x,a)$; then by conditioning on the first step of the walk you obtain a linear recurrence relation. (This uses the Markov property and the time homogeneity of the problem.) You can solve it with the boundary conditions $q(1,a)=1$, $q(a,a)=0$. The answer to your problem is then $\lim_{a \to -\infty} q(0,a)$.
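For concreteness, here is how the recurrence and its solution work out (this derivation is worked out here, not stated in the answer above):

```latex
% Conditioning on the first step, for a < x < 1:
q(x,a) = \tfrac12\, q(x+1,a) + \tfrac12\, q(x-1,a), \qquad a < x < 1,
% i.e. the second difference of q(.,a) vanishes, so q(.,a) is affine in x.
% Fitting the boundary conditions q(1,a) = 1 and q(a,a) = 0 gives
q(x,a) = \frac{x-a}{1-a}, \qquad\text{hence}\qquad
\lim_{a \to -\infty} q(0,a) = \lim_{a \to -\infty} \frac{-a}{1-a} = 1 .
```

The affine solutions are exactly the solutions of the recurrence because it says that the second difference of $q(\cdot,a)$ is zero.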
Implicit in this procedure is the assumption that the walk almost surely hits $a$ or $1$, i.e. that it does not oscillate strictly between them forever. You should consider why that is the case. One way to prove it is again to prove something seemingly harder, namely that the expected time to hit $\{ a,1 \}$ starting from $x$ is finite. This expectation again satisfies a linear recurrence with boundary conditions, and again you can solve it exactly.
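As a sanity check, that second recurrence can be solved numerically. The sketch below is my own illustration, not part of the answer; the closed form $m(x)=(x-a)(1-x)$ used as a cross-check is computed here, not given above.

```python
# Solve the expected-hitting-time recurrence
#   m(x) = 1 + (m(x+1) + m(x-1)) / 2   for a < x < 1,
# with boundary conditions m(a) = m(1) = 0, as a tridiagonal linear system.
# Cross-check against the closed form m(x) = (x - a)(1 - x).

def hitting_times(a):
    """Return {x: m(x)} for the interior states x = a+1, ..., 0."""
    size = -a  # number of interior states a+1, ..., 0
    # System: -0.5*m(x-1) + m(x) - 0.5*m(x+1) = 1; boundary terms vanish.
    A = [[0.0] * size for _ in range(size)]
    b = [1.0] * size
    for i in range(size):
        A[i][i] = 1.0
        if i > 0:
            A[i][i - 1] = -0.5
        if i < size - 1:
            A[i][i + 1] = -0.5
    # Gaussian elimination; the matrix is diagonally dominant, no pivoting needed.
    for i in range(size):
        for j in range(i + 1, size):
            f = A[j][i] / A[i][i]
            for k in range(i, size):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    m = [0.0] * size
    for i in reversed(range(size)):
        tail = sum(A[i][k] * m[k] for k in range(i + 1, size))
        m[i] = (b[i] - tail) / A[i][i]
    return {a + 1 + i: m[i] for i in range(size)}

a = -10
m = hitting_times(a)
assert all(abs(m[x] - (x - a) * (1 - x)) < 1e-9 for x in m)  # matches (x-a)(1-x)
print(m[0])  # expected exit time from (a, 1) starting at 0
```

Note that $m(0)=(0-a)(1-0)=-a \to \infty$ as $a \to -\infty$, which is consistent with the well-known fact that $T$ is a.s. finite but has infinite mean.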
Another way to prove that is to use the general theory of Markov chains: if, with positive probability, $1$ is never visited, then $1$ is transient, so in fact all states are transient and $X_n \to -\infty$ as $n \to \infty$. This latter approach shows that you can apply the procedure above to any irreducible time-homogeneous Markov chain on $\mathbb{Z}$.
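The whole procedure is also easy to check empirically. The following sketch (my own illustration, not part of the answer) estimates $q(0,a)$ by simulation and compares it with $-a/(1-a)$, the value obtained by solving the recurrence with the stated boundary conditions; the estimates rise toward $1$ as $a \to -\infty$.

```python
# Monte Carlo estimate of q(0, a): the probability that the walk started
# at 0 hits 1 before hitting a. Compared against -a / (1 - a), the
# solution of the recurrence with q(1,a) = 1 and q(a,a) = 0.
import random

def estimate_q(a, trials=20000, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    hits = 0
    for _ in range(trials):
        x = 0
        while a < x < 1:  # run until the walk exits (a, 1)
            x += 1 if rng.random() < 0.5 else -1
        hits += (x == 1)
    return hits / trials

for a in (-1, -3, -9):
    print(a, estimate_q(a), -a / (1 - a))  # estimates approach 1
```

Running this for ever more negative $a$ illustrates the limit $\lim_{a \to -\infty} q(0,a) = 1$, i.e. $P(T<\infty)=1$.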