Show that
$$f_n(x)= \frac{2n^2x}{(1+n^2 x^2) \ln(1+n)}$$
converges uniformly for $x\in [a,\infty)$ with $a>0$.
How do I prove the uniform convergence for that interval?
How does this differ from doing it for $x\in\mathbb R$?
Why does the sequence fail to converge uniformly on $\mathbb R$?
For $x\geq 0$, $f_n(x)\to 0$ as $n\to\infty$, so the pointwise limit is $f\equiv 0$. Moreover, since $$f'_n(x)=\frac{2n^2}{\ln(1+n)}\cdot\frac{1-n^2x^2}{(1+n^2x^2)^2},$$ each $f_n$ is a nonnegative continuous function which is increasing on $[0,1/n]$, attains its maximum at $x=1/n$, and is decreasing on $[1/n,+\infty)$.

Hence, for $a>0$ and $n\geq 1/a$ (so that $1/n\leq a$), $$\sup_{[a,+\infty)}|f_n(x)-f(x)|=f_n(a)=\frac{2n^2a}{(1+n^2a^2)\ln(1+n)}\leq\frac{2}{a\ln(1+n)}\to 0,$$ and therefore $(f_n)_n$ converges uniformly on $[a,+\infty)$.

On the other hand, $$\sup_{[0,+\infty)}|f_n(x)-f(x)|=f_n(1/n)=\frac{n}{\ln(1+n)}\to +\infty,$$ which implies that $(f_n)_n$ does not converge uniformly on $[0,+\infty)$, and hence not on $\mathbb R$ either.
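As a quick numerical sanity check (not part of the proof), one can tabulate the two suprema discussed above: on $[a,+\infty)$ the supremum is $f_n(a)$ once $n\geq 1/a$, while on $[0,+\infty)$ it is $f_n(1/n)=n/\ln(1+n)$. The function name `f` and the sample values of `a` and `n` below are arbitrary choices for illustration:

```python
import math

def f(n, x):
    # f_n(x) = 2 n^2 x / ((1 + n^2 x^2) * ln(1 + n))
    return 2 * n**2 * x / ((1 + n**2 * x**2) * math.log(1 + n))

a = 0.5  # any fixed a > 0

for n in [10, 100, 1000, 10**4, 10**5]:
    # On [a, +inf) the sup is attained at x = a (since n >= 1/a here),
    # and it shrinks like 2 / (a * ln(1 + n)).
    sup_on_a = f(n, a)
    # On [0, +inf) the sup sits at x = 1/n and equals n / ln(1 + n),
    # which blows up as n grows.
    sup_on_zero = f(n, 1.0 / n)
    print(f"n={n:>6}  sup on [a,inf)={sup_on_a:.6f}  sup on [0,inf)={sup_on_zero:.2f}")
```

The first column decreases toward $0$ while the second grows without bound, matching the dichotomy in the argument above.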