Suppose we try to show that the sequence $\{n\}$ is Cauchy. We need to show that for every $\varepsilon>0$, there exists $N \in \mathbb R$ such that $n,m > N \implies |n-m|<\varepsilon$.
$|n-m|<n+m$. So, if we make $n+m< \varepsilon$, the result will follow. This will happen if both $n<\varepsilon/2$ and $m<\varepsilon/2$.
We see that $n>-\varepsilon/2$ and $m>-\varepsilon/2$. So, let $N=-\varepsilon/2$. Then $$n,m > N \implies |n-m|<n+m<\varepsilon/2+\varepsilon/2=\varepsilon.$$
I know that $\{n\}$ diverges, so it cannot be Cauchy, yet each step seems logical. Where is the fallacy?
The error in your proof is the assumption $n<\varepsilon/2$ (and similarly for $m$). You correctly note that $n+m<\varepsilon$ follows *if* $n<\varepsilon/2$ and $m<\varepsilon/2$, but then you replace these conditions with $n>-\varepsilon/2$ and $m>-\varepsilon/2$, which reverses the implication: $n>-\varepsilon/2$ is implied by $n<\varepsilon/2$ (for small $\varepsilon$ and positive $n$), but it does not imply it. Choosing $N=-\varepsilon/2$ only guarantees $n>-\varepsilon/2$; it cannot guarantee $n<\varepsilon/2$, which fails for every $n\ge\varepsilon/2$. So you cannot deduce $|n-m|<\varepsilon$ for all large $m$ and $n$.
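To see concretely that no choice of $N$ works, here is a minimal numeric sketch (the function name `cauchy_gap` is just illustrative): for any $N$, taking two consecutive integer indices beyond $N$ gives $|n-m|=1$, which is not below, say, $\varepsilon=1/2$.

```python
import math

def cauchy_gap(N):
    """For the sequence a_n = n, exhibit indices n, m > N
    whose terms differ by exactly 1."""
    # Smallest integer index strictly greater than N.
    n = math.floor(N) + 1
    m = n + 1
    return n, m, abs(n - m)

# No matter how large N is, consecutive terms beyond N are 1 apart,
# so |a_n - a_m| < eps fails whenever eps <= 1.
for N in [0, 10, 10**6]:
    n, m, gap = cauchy_gap(N)
    assert n > N and m > N
    assert not gap < 0.5  # the Cauchy condition fails for eps = 1/2
```

This is exactly the negation of the Cauchy condition: there exists $\varepsilon>0$ (here $\varepsilon=1/2$) such that for every $N$ there are $n,m>N$ with $|n-m|\ge\varepsilon$.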