$A$ starts with $i$ coins, $B$ with $N-i$. At each trial, $A$ gives one coin to $B$ with probability $p$ or $B$ gives one coin to $A$ with probability $q$ where $p+q=1$.
This can be modeled as a one-dimensional random walk starting from $i$, where the probability of moving right is $p$ and of moving left is $q$; the walk ends on reaching either $0$ or $N$.
Nothing in this statement seems to rule out the possibility of oscillating around $i$ forever and never reaching either $0$ or $N$. However, after doing the following calculation, something seems off.
Let $p_i$ be the probability that A ends up with all the money, i.e., that the walk reaches $N$ when the starting position is $i$. Then $$p_i = p\,p_{i+1} + q\,p_{i-1}.$$ Solving this difference equation gives $$p_i = \frac{1-(\frac{q}{p})^i}{1-(\frac{q}{p})^N}$$ for $p\neq q$.
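As a sanity check, the closed form can be verified numerically against the recurrence and the boundary conditions $p_0 = 0$, $p_N = 1$. This is a minimal sketch; the values of $N$ and $p$ are illustrative choices, not from the question.

```python
# Check that the closed-form p_i satisfies p_i = p*p_{i+1} + q*p_{i-1}
# with boundary conditions p_0 = 0 and p_N = 1.
N, p = 10, 0.6   # illustrative choices
q = 1 - p

def p_win(i):
    """Closed-form probability of reaching N from i (valid for p != q)."""
    r = q / p
    return (1 - r**i) / (1 - r**N)

# Boundary conditions hold.
assert abs(p_win(0)) < 1e-12
assert abs(p_win(N) - 1) < 1e-12

# The recurrence holds at every interior state.
for i in range(1, N):
    assert abs(p_win(i) - (p * p_win(i + 1) + q * p_win(i - 1))) < 1e-12
print("closed form satisfies the recurrence")
```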
Now, $P(\text{reaching } N \text{ starting from } i) = p_i$. By symmetry, $P(\text{reaching } 0 \text{ starting from } i) = \frac{1-(\frac{p}{q})^{N-i}}{1-(\frac{p}{q})^N}$.
Adding those together gives exactly $1$, which means either A wins or B wins: there is no probability left over for oscillating around $i$ forever and never reaching either $0$ or $N$. Why is that so?
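The claim that the two absorption probabilities sum to $1$ can be confirmed numerically for every starting stake. A minimal sketch, with $N$ and $p$ chosen arbitrarily for illustration:

```python
# Verify that P(A wins) + P(B wins) = 1 for every starting position i.
N, p = 10, 0.6   # illustrative choices
q = 1 - p

for i in range(N + 1):
    a_wins = (1 - (q / p)**i) / (1 - (q / p)**N)
    b_wins = (1 - (p / q)**(N - i)) / (1 - (p / q)**N)
    assert abs(a_wins + b_wins - 1) < 1e-12
print("P(A wins) + P(B wins) = 1 at every i")
```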
The probability of the event of going from $i$ to $i+1$, back to $i$, then to $i+1$ again, and so on for $n$ round trips is $p \cdot q \cdot p \cdot q \cdots = (pq)^n$, where $n$ can grow without bound. However small, this is a positive number; unless $n$ goes to infinity, it is greater than zero.
I understand that $\lim_{n \to \infty} (pq)^n = 0$, but how is that applicable here? As $n \to \infty$, the probability $\to 0$; but $n$ is always finite, so the probability is always greater than $0$. Please tell me if my interpretation of limits is wrong.
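To make this concrete, the probability of oscillating for at least $n$ round trips shrinks geometrically (note $pq \le \frac14$, with equality at $p = q = \frac12$), and "oscillating forever" is the intersection of all of these shrinking events, so its probability is the limit, $0$. A small numerical sketch:

```python
# The event "oscillate for at least n round trips" has probability (p*q)**n.
# "Oscillate forever" is contained in every one of these events, so its
# probability is at most (p*q)**n for every n, hence 0.
p = 0.5   # illustrative choice; p*q is largest when p = q = 0.5
q = 1 - p
for n in (1, 10, 50, 100):
    print(n, (p * q)**n)
# (0.25)**100 is already below 1e-60.
```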
Is it correct that either A or B has to win? Why?
This seems to be a question about the meaning of limits. I think the following much simpler example captures all the issues.
Suppose you flip a fair coin again and again, and consider the event that all results are Heads. Clearly $\text{Prob}(\text{First } N \text{ are all heads}) = p_N = {1 \over 2^N}$. The following statements are all true and not contradictory:
For any finite $N$, $p_N > 0$.
$\lim_{N \rightarrow \infty} \ p_N = 0$.
Suppose you lose when the first Tail shows up. The event $E$ that you don't lose is non-empty, i.e. $E \neq \emptyset$, or equivalently, there exists a sample point (namely, all Heads forever) where you don't lose. However $P(E) = 0$ (or equivalently, $P(\text{lose}) = 1$).
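One way to see $P(\text{lose}) = 1$ directly: the events "first Tail at flip $k$" are disjoint with probabilities $(1/2)^k$, and these sum to $1$, leaving nothing for "all Heads forever". A quick numerical sketch of the partial sums:

```python
# P(first Tail at flip k) = (1/2)**k; summing over k gives P(lose).
# The partial sum up to k = 199 differs from 1 by only (1/2)**199.
total = sum(0.5**k for k in range(1, 200))
print(total)
assert abs(total - 1.0) < 1e-12
```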
$E \neq \emptyset$ and yet $P(E)=0$ is not that surprising. After all, the sample space contains an infinite number of sample points (all infinitely-long sequences of H/T, i.e. $\{H,T\}^\infty$). This kind of thing happens all the time with continuous random variables: e.g. if $X \sim \text{Uniform}(0,1)$ then $P(X=0.1) = 0$ even though $\{X = 0.1\}$ is clearly a non-empty event.
Back to your example, the analogous $E = $ the set of sample points where the random walk never reaches either boundary (more precisely, $\forall t$ the position at $t \neq$ either boundary). Clearly $E \neq \emptyset$, but you have also shown that $P(E)= 0$.
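A Monte Carlo simulation also illustrates the point: in practice every simulated walk hits a boundary in finite time, consistent with $P(E) = 0$. This is an illustrative sketch; the values of $N$, the starting position, $p$, and the trial count are arbitrary choices.

```python
# Simulate the gambler's-ruin walk many times and count how many runs
# are absorbed at 0 or N. With probability 1 every run terminates.
import random

random.seed(0)
N, start, p = 10, 5, 0.5   # illustrative choices
trials = 10_000

absorbed = 0
for _ in range(trials):
    pos = start
    while 0 < pos < N:
        pos += 1 if random.random() < p else -1
    absorbed += 1  # loop exited, so the walk reached 0 or N
print(absorbed, "of", trials, "walks were absorbed")
```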
Does this help, or at least, help to distill the issue?