Does $f_n(x) = \frac{x^n}{1+x^n}$ converge uniformly on $[0,1]$?
My answer is: No
because obviously $f_n(0) = 0$ and $f_n(1)=\frac{1}{2}$, so for every $n\in\mathbb{N}$, taking $\varepsilon = \frac{1}{4}$, we have $f_n(1)-f_n(0) > \varepsilon$.
Is this question stupid or am I?
Basically the summary is this:
$$\lim_{n \to \infty} f_n(x) = \begin{cases} 0 & x\in [0, 1)\\ \frac{1}{2} & x = 1 \end{cases}$$
Each $f_n$ is continuous on $[0,1]$, yet the limit function is discontinuous at $x = 1$. Since a uniform limit of continuous functions is continuous, the convergence cannot be uniform.
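If you want to see this numerically, here is a quick sketch (assuming NumPy) that estimates $\sup_{x\in[0,1]}|f_n(x) - f(x)|$ on a dense grid; the sup-norm distance stays near $\frac{1}{2}$ rather than shrinking to $0$:

```python
import numpy as np

def f_n(x, n):
    # f_n(x) = x^n / (1 + x^n)
    return x**n / (1 + x**n)

def f_limit(x):
    # pointwise limit: 0 on [0, 1), 1/2 at x = 1
    return np.where(x < 1, 0.0, 0.5)

x = np.linspace(0.0, 1.0, 100_001)  # dense grid on [0, 1]
for n in (10, 100, 1000):
    sup_err = np.max(np.abs(f_n(x, n) - f_limit(x)))
    print(n, sup_err)  # stays close to 1/2, well above eps = 1/4
```

The grid only approximates the true supremum, but it is enough to see that $\|f_n - f\|_\infty$ does not tend to $0$, which is exactly the failure of uniform convergence.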
Also to correct your logic, consider this example. Let $g_n(x) = x^3 + 1/n$, then $$g_n(x) \rightrightarrows x^3 = g(x).$$
Now $g_n(1) = 1 + 1/n$ and $g_n(0) = 1/n$, so $g_n(1) - g_n(0) = 1 \nleq \epsilon = 1/4$; by your logic $g_n$ would fail to converge uniformly, even though it does. A gap between $f_n(0)$ and $f_n(1)$ says nothing about uniform convergence; what matters is the distance from $f_n$ to the limit, $\sup_x |f_n(x) - g(x)|$.
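The same kind of numerical check (again a sketch, assuming NumPy) shows that for $g_n$ the sup-norm distance is exactly $1/n$, so it does go to $0$ and the convergence is uniform:

```python
import numpy as np

def g_n(x, n):
    # g_n(x) = x^3 + 1/n, which converges uniformly to x^3
    return x**3 + 1/n

x = np.linspace(0.0, 1.0, 1001)
for n in (10, 100, 1000):
    # |g_n(x) - x^3| = 1/n for every x, so the sup is 1/n
    sup_err = np.max(np.abs(g_n(x, n) - x**3))
    print(n, sup_err)
```

Here $\|g_n - g\|_\infty = 1/n \to 0$ even though $g_n(1) - g_n(0) = 1$ for every $n$, which is why that difference cannot detect (non-)uniform convergence.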