I have the functions $$ f_n(x) = x + x^n(1 - x)^n $$
which converge to $x$ pointwise as $n \to \infty$.
Now I have to check whether the sequence converges uniformly, so I used the theorem and arrived at: $$ \sup\limits_{x\in[0,1]}|f_n(x)-x| = |x+x^n(1-x)^n - x| = |x^n(1-x)^n| \to 0 \quad (n \to \infty) $$
However, the master solution suggested something different that I'm unable to comprehend:
$$ \forall x \in [0,1]: 0 \leq x(1-x) = x - x^2 = 1/4 - (x - 1/2)^2 \leq 1/4 $$ $$ \sup\limits_{x\in[0,1]}|f_n(x) - x| = \sup\limits_{x\in[0,1]}|(x(1-x))^n| \leq (1/4)^n \to 0, (n \to \infty)$$
So, is my solution wrong? And how did they come up with that estimate? Or, the better question: how can I learn to come up with that kind of thing? I'm in my first semester at university, and Analysis is... difficult.
When computing $\sup\limits_{x\in[0,1]}|f_n(x)-x|$ you should end up with a bound that does not depend on $x$. In your chain the final expression $|x^n(1-x)^n|$ still contains $x$, so you have not actually evaluated the supremum; the model solution's estimate $x(1-x) \leq 1/4$ is exactly what removes the $x$-dependence.
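As a numerical sanity check (not a proof), here is a minimal sketch that approximates the supremum of $|f_n(x)-x| = (x(1-x))^n$ on a grid over $[0,1]$ and compares it with the $x$-free bound $(1/4)^n$; the grid size and variable names are my own choices:

```python
# Grid approximation of sup_{x in [0,1]} |f_n(x) - x| = (x(1-x))^n,
# compared against the x-independent bound (1/4)^n from the model solution.
xs = [i / 1000 for i in range(1001)]  # evenly spaced grid on [0, 1]

for n in range(1, 8):
    sup_approx = max((x * (1 - x)) ** n for x in xs)
    bound = 0.25 ** n
    # The maximum of x(1-x) occurs at x = 1/2, so both columns shrink like (1/4)^n.
    print(f"n={n}: sup ≈ {sup_approx:.3e}, bound = {bound:.3e}")
```

Since the grid contains $x = 1/2$, the approximate supremum equals the bound exactly, which illustrates why $(1/4)^n \to 0$ forces uniform convergence.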