I wanted to test the uniform convergence of $f_n(x)=\dfrac{1}{1+(x-n)^2}$ on $[0,\infty[$ and $]-\infty,0]$ separately. Using the $M_n$ test, I found the pointwise limit $f(x)$ to be $0$. Further, I obtained $M_n=\sup_x\vert f_n(x)-f(x)\vert=1$, attained at $x=n$.
So the points of non-uniform convergence are $x=n\to\infty$, and hence $f_n(x)$ does NOT converge uniformly on any interval that is unbounded above, say $[a,\infty[$ with $a$ any finite real.
Thus I concluded that $f_n(x)$ converges uniformly on $]-\infty,0]$ but not on $[0,\infty[$. Am I right in my argument? Is there an alternative approach to this?
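As a quick numerical sanity check of the claim above (my own illustration, not part of the argument), one can evaluate $f_n$ on a wide grid and confirm that $\sup_x\vert f_n(x)-0\vert$ is $1$, attained near $x=n$:

```python
import numpy as np

def f(n, x):
    # f_n(x) = 1 / (1 + (x - n)^2); the pointwise limit is 0
    return 1.0 / (1.0 + (x - n) ** 2)

# A wide grid standing in for the real line (step 0.001)
x = np.linspace(-50.0, 150.0, 200001)

for n in (1, 10, 100):
    vals = f(n, x)
    i = np.argmax(vals)
    # The maximum value is 1, located at the grid point closest to x = n
    print(n, x[i], vals[i])
```

The maximum stays at $1$ for every $n$; only its location $x=n$ moves off to the right, which is exactly what blocks uniform convergence on intervals that are unbounded above.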
The $1$ you're talking about lives in the range space. The maximum of each $f_n$ occurs at $x=n$, so there's nothing special about the value $1$ in the domain.
But you're right that since every $f_n$ attains the same maximum value $1$, the sequence cannot converge uniformly on $\mathbb R$ to the zero function. Note, however, that on a domain bounded above it does converge uniformly, since the peaks at $x=n$ eventually lie outside the domain: for $x\le 0$, $f_n$ is increasing, so $\sup_{x\le 0}\vert f_n(x)\vert=f_n(0)=\frac{1}{1+n^2}\to 0$. By this same reasoning, $f_n$ converges uniformly on $(-\infty,0]$ but not on $[0,\infty)$.
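This contrast between the two half-lines can also be seen numerically (a sketch of my own, using finite grids as stand-ins for the half-lines): the supremum of $\vert f_n\vert$ over $x\le 0$ is $f_n(0)=\frac{1}{1+n^2}$, which shrinks to $0$, while the supremum over $x\ge 0$ stays pinned at $1$:

```python
import numpy as np

def f(n, x):
    return 1.0 / (1.0 + (x - n) ** 2)

xs_neg = np.linspace(-1000.0, 0.0, 100001)  # grid approximating ]-inf, 0]
xs_pos = np.linspace(0.0, 1000.0, 100001)   # grid approximating [0, inf[

for n in (1, 10, 100):
    # On x <= 0, f_n is increasing, so the sup is f_n(0) = 1/(1+n^2) -> 0
    sup_neg = f(n, xs_neg).max()
    # On x >= 0, the peak at x = n stays inside the domain, so the sup is 1
    sup_pos = f(n, xs_pos).max()
    print(n, sup_neg, sup_pos)
```

The first column of suprema tends to $0$ (uniform convergence on $]-\infty,0]$), while the second never drops below $1$ (no uniform convergence on $[0,\infty[$).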