Consider the series $\sum_{k=1}^{\infty}(-1)^k\,\dfrac{x+k}{k^2}$ for $x \in [0,1]$.
By the Weierstrass M-test, the series $\sum_{k=1}^{\infty}(-1)^k\frac{x}{k^2}$ converges uniformly on $[0,1]$. I assume that $\sum_{k=1}^{\infty}\frac{(-1)^k}{k}$ is not uniformly convergent, since it has no $x$ term and so does not seem to fit the definition of uniform convergence for a series of functions. (I do know that it converges conditionally.)
Is the original series therefore uniformly convergent?
Hint. One has $$ \sup_{x \in [0,1]}\left|(-1)^k\frac{x+k}{k^2} \right|=\frac1{k^2}+\frac1k\qquad (k\ge1), $$ and $\sum_{k\ge1}\left(\frac1{k^2}+\frac1k\right)$ diverges, so the Weierstrass M-test does not apply to the series as a whole (it is not normally convergent). Uniform convergence nevertheless holds: for each fixed $x\in[0,1]$ the terms $a_k(x)=\frac{x+k}{k^2}$ decrease to $0$ in $k$, so the alternating series estimate gives $$ \left|\sum_{k=n+1}^{\infty}(-1)^k\frac{x+k}{k^2}\right|\le a_{n+1}(x)\le\frac1{(n+1)^2}+\frac1{n+1}\xrightarrow[n\to\infty]{}0, $$ a bound independent of $x$. Hence the series converges uniformly on $[0,1]$. Note also that your assumption about $\sum_k\frac{(-1)^k}{k}$ is not quite right: a convergent series of constant functions converges uniformly automatically, since its partial sums do not depend on $x$.
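As a numerical sanity check (not a proof), here is a short Python sketch of the sup-norm error of the partial sums. It uses the standard closed forms $-\pi^2/12$ and $-\ln 2$ for the two constant series $\sum_k\frac{(-1)^k}{k^2}$ and $\sum_k\frac{(-1)^k}{k}$, and the observation that $S_n(x)-S(x)$ is affine in $x$, so its supremum over $[0,1]$ is attained at an endpoint.

```python
import math

def partial_coeffs(n):
    # Each term splits as (-1)^k * x / k^2 + (-1)^k / k,
    # so the n-th partial sum is S_n(x) = A_n * x + B_n.
    A = sum((-1) ** k / k**2 for k in range(1, n + 1))
    B = sum((-1) ** k / k for k in range(1, n + 1))
    return A, B

# Closed forms for the limits of the two alternating series:
A_INF = -math.pi ** 2 / 12   # sum_{k>=1} (-1)^k / k^2
B_INF = -math.log(2)         # sum_{k>=1} (-1)^k / k

def sup_error(n):
    # S_n(x) - S(x) = (A_n - A) * x + (B_n - B) is affine in x,
    # so |S_n - S| on [0,1] is maximised at x = 0 or x = 1.
    A, B = partial_coeffs(n)
    return max(abs(B - B_INF), abs((A - A_INF) + (B - B_INF)))

for n in (10, 100, 1000, 10000):
    print(n, sup_error(n))
```

The printed errors shrink roughly like $\frac{1}{2n}$, consistent with the alternating-series bound $\frac1{(n+1)^2}+\frac1{n+1}$ above.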