We have a Markov chain on $\mathbb{N}$ with transition probabilities $p_{i,i-1}=1$ for $i\geq 1$ and $p_{0,i}=p_i>0$ for all $i\geq 0$, where $\sum_i p_i=1$ and $\sum_i ip_i<\infty$.
How do I show that this Markov chain is recurrent?
I know that recurrent means that, starting in state $i$, the chain returns to $i$ at some finite time, i.e. $\mathbb{P}\bigl(\bigcup_{t=1}^\infty\{X_t=i\}\mid X_0=i\bigr)=1$.
The chain behaves as follows: from each state $i\geq 1$ it moves with probability $1$ to the previous state $i-1$, and from $0$ it can jump to any state. So intuitively $0$ is recurrent; and any other state $i$ eventually reaches $0$, from which the chain can jump to a state $\geq i$ and thus pass through $i$ again. But how can I show this formally?
In your Markov chain there is a non-zero probability of getting from any state in $\mathbb{N}$ to any other state in $\mathbb{N}$ (check the details), so the chain is irreducible. In an irreducible Markov chain, if one state is recurrent then every state is recurrent. Since you have shown that $0$ is a recurrent state, the entire chain is recurrent.
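To make "$0$ is recurrent" formal, you can compute the return time to $0$ directly (a sketch; $T_0$ is my notation for the first return time, $T_0=\inf\{t\geq 1: X_t=0\}$). Starting from $0$, the chain jumps to some $i$ with probability $p_i$ and then walks down $i\to i-1\to\dots\to 0$ deterministically, so $T_0=1+i$ on that event. Hence
$$ \mathbb{P}(T_0<\infty\mid X_0=0)=\sum_{i\geq 0}p_i=1, \qquad \mathbb{E}[T_0\mid X_0=0]=\sum_{i\geq 0}(1+i)p_i=1+\sum_{i\geq 0}ip_i<\infty. $$
The first identity says $0$ is recurrent; the second says it is even positive recurrent. Note that this is where the assumption $\sum_i ip_i<\infty$ enters: it is not needed for recurrence, only for positive recurrence.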
To find the steady-state distribution, try solving the global balance equations, $$ \sum_{j \in \mathbb{N}\setminus\{i\}} \pi_i p_{i,j} = \sum_{j \in \mathbb{N}\setminus\{i\}} \pi_j p_{j,i}. $$
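For a concrete sanity check, here is a short simulation sketch (my own illustration, not part of the question: the geometric choice $p_i=(1-q)q^i$ and all names in the code are assumptions). Solving the balance equations for this chain gives $\pi_j=\pi_0\sum_{k\geq j}p_k$ with $\pi_0=\bigl(1+\sum_i ip_i\bigr)^{-1}$; with the geometric $p_i$ and $q=\tfrac12$ this works out to $\pi_j=(1-q)q^j$, which the long-run occupation frequencies should match:

```python
import random

def simulate(n_steps, q=0.5, seed=0):
    """Run the chain: from i >= 1 step down to i - 1; from 0 jump to i
    with probability p_i = (1 - q) * q**i (a geometric distribution,
    chosen here purely for illustration)."""
    rng = random.Random(seed)
    counts = {}
    state = 0
    for _ in range(n_steps):
        counts[state] = counts.get(state, 0) + 1
        if state >= 1:
            state -= 1                    # deterministic step down
        else:
            state = 0                     # sample the geometric jump from 0
            while rng.random() < q:
                state += 1
    # empirical occupation frequencies
    return {s: c / n_steps for s, c in counts.items()}

freqs = simulate(200_000)
# compare with the candidate stationary distribution pi_j = (1 - q) * q**j
for j in range(4):
    print(j, round(freqs.get(j, 0.0), 3), (1 - 0.5) * 0.5 ** j)
```

The printed empirical frequencies should be close to $\tfrac12, \tfrac14, \tfrac18, \dots$, and $1/\pi_0 = 2$ agrees with the mean return time $1+\sum_i ip_i$ computed above.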