Starting at some state $i$, the probability of moving up is $P_{i,i+1} = p$ and the probability of moving down is $P_{i,i-1} = 1-p$. What is the probability that I reach $N$ before I reach zero?
Can I convert this to a gambler's ruin problem where I start with $i$ in the bank and win if I reach $N$ before I reach zero?
I think the idea is to use the difference equation $f_{i+1} - f_i = \frac{q}{p}(f_i - f_{i-1})$ (with $q = 1-p$), where $f_0 = 0$ and $f_N = 1$ are the boundary conditions at the two absorbing states, and this gives rise to $f_i = \frac{1-(q/p)^i}{1-(q/p)^N}$.
This gives me the probability of reaching $N$ without ever reaching zero, but is that the same as the probability of reaching $N$ before $0$? If not, how can I find that?
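A quick numerical sanity check may help here: the closed-form $f_i$ above can be compared against a Monte Carlo simulation of the walk run until it hits either $0$ or $N$. This is only a sketch; the function names and the example parameters ($i=3$, $N=10$, $p=0.6$) are my own choices, and the closed form assumes $p \neq 1/2$.

```python
import random

def hit_prob_closed_form(i, N, p):
    """Closed-form probability of reaching N before 0 from state i (p != 1/2)."""
    r = (1 - p) / p  # r = q/p
    return (1 - r**i) / (1 - r**N)

def hit_prob_mc(i, N, p, trials=200_000, seed=0):
    """Monte Carlo estimate: run the walk until it is absorbed at 0 or N."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        s = i
        while 0 < s < N:
            s += 1 if rng.random() < p else -1
        wins += (s == N)
    return wins / trials
```

With enough trials the two estimates should agree to roughly two decimal places, which supports the interpretation that $f_i$ is indeed the probability of hitting $N$ before $0$.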
Your formula is correct, assuming $q=1-p$; here's an argument in biological terms. This Markov chain is a birth-death process, and the concept you are looking for is the fixation probability (called the absorption probability in general). Formula (1) on this page gives the probability of absorption at $N$ starting at state $i$, where $\gamma_k = \frac{1-p}{p}$ for your process. Formula (1) simplifies to your formula by also using the formula for a finite geometric series -- the ones cancel because both sums start at $k=1$.
The fixation probability gives the probability of the process being absorbed at the state $N$ when starting from state $i$; one minus this probability gives the probability of absorption at $0$ (since there are no other absorbing states). Your second question, the probability of reaching $N$ before $0$, is the same quantity, since $0$ and $N$ are both absorbing, and the only absorbing states.
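The simplification described above can be checked exactly with rational arithmetic: the general birth-death fixation formula with constant $\gamma_k = q/p$ is a ratio of finite geometric sums, which collapses to the closed form in the question. A minimal sketch, with my own function names and the example $\gamma = 2/3$ (i.e. $p = 0.6$):

```python
from fractions import Fraction

def fixation_general(i, N, gamma):
    """General birth-death fixation probability with constant gamma_k = gamma:
    a ratio of the partial geometric sums up to i and up to N."""
    num = sum(gamma**k for k in range(i))
    den = sum(gamma**k for k in range(N))
    return Fraction(num, den)

def fixation_closed(i, N, gamma):
    """Closed form from summing the geometric series (requires gamma != 1)."""
    return (1 - gamma**i) / (1 - gamma**N)
```

Using `Fraction` makes the agreement exact rather than approximate, and the boundary cases come out as expected: starting at $i=0$ gives fixation probability $0$, and starting at $i=N$ gives $1$, consistent with $1 - f_i$ being the absorption probability at $0$.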