Consider the sequence of functions $\{f_n\}=\left \{\sqrt{x^2+\frac{1}{n^2}}\right\}$.
In Chapter 24 "Uniform Convergence and Power Series" of Spivak's Calculus we are asked to determine the pointwise limit of this sequence on the interval
$$[a,\infty), a>0$$
if it exists, and also determine whether this sequence converges uniformly to $f(x)=\lim\limits_{n\to\infty} f_n(x)$.
My solution to this problem differed from the one in the solution manual. I'd like to show the manual's solution first, and then mine. My question is whether my solution is correct (and also, whether $a$ can be $0$).
The solution manual uses the Mean Value Theorem to show that the convergence is uniform. Here is the argument in full:
$$f_n(x)-f(x)=\sqrt{x^2+\frac{1}{n^2}}-\sqrt{x^2}$$
Let $g(x)=\sqrt{x}$.
Then, by the MVT we have that
$$\frac{g(x^2+\frac{1}{n^2})-g(x^2)}{x^2+1/n^2-x^2}=g'(\alpha)$$
for some $\alpha\in (x^2,x^2+1/n^2)$.
Thus
$$f_n(x)-f(x)=\frac{1}{2n^2\sqrt{\alpha}}\tag{1}$$
Now,
$$x^2<\alpha<x^2+\frac{1}{n^2}\implies \sqrt{\alpha}>x\implies \frac{1}{\sqrt{\alpha}}<\frac{1}{x}$$
Hence, from (1) we now have
$$f_n(x)-f(x)=\frac{1}{2n^2\sqrt{\alpha}}<\frac{1}{2n^2}\cdot\frac{1}{x}\leq\frac{1}{2n^2a}$$
Since this bound doesn't depend on $x$, we can make the difference arbitrarily small for all points in $[a,\infty)$ simultaneously. Thus, $\{f_n\}$ converges uniformly to $f$ on this interval.
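As a quick numerical sanity check (my own addition, not part of the manual's argument), the bound $\frac{1}{2n^2a}$ can be tested on a finite grid standing in for $[a,\infty)$; the value $a=0.5$ below is an arbitrary choice:

```python
import math

# Sanity check of the MVT bound: on [a, inf) the gap f_n(x) - f(x)
# should stay strictly below 1/(2 n^2 a).
a = 0.5                                          # any fixed a > 0
xs = [a + k * 0.001 for k in range(200_000)]     # grid standing in for [a, inf)

for n in (1, 2, 5, 10):
    # f_n(x) - f(x), where the pointwise limit on [a, inf) is f(x) = x
    gap = max(math.sqrt(x**2 + 1/n**2) - x for x in xs)
    bound = 1 / (2 * n**2 * a)
    assert gap < bound
    print(f"n={n:2d}  sup gap ~ {gap:.6f}  bound = {bound:.6f}")
```

Since $f_n(x)-f(x)$ is decreasing in $x$, the supremum on the grid occurs at $x=a$, and it does sit below the bound for every $n$ tried.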
And now here is my solution:
First I plotted $f_n$ for $n=1$ to $5$.
The pointwise limit is $f(x)=\lim\limits_{n\to\infty} f_n(x)=x$.
Note that $f_n(0)=\frac{1}{n}$ and
$$[f_n(x)-f(x)]'=\frac{x}{\sqrt{x^2+1/n^2}}-1<0$$
At this point I concluded that since this difference equals $\frac{1}{n}$ at $x=0$ and is decreasing for $x>0$, $f_n$ converges uniformly to $f$ on $[0,\infty)$.
Is this reasoning correct? If it is, why did the problem ask us to show uniform convergence excluding $a=0$?
EDIT: Having just moved on to item (iv), I noticed that it asks about uniform convergence of the same sequence of functions on $\mathbb{R}$.
Thus, it seems like I am pretty much answering my own question here, but on $\mathbb{R}$ we have
$$f(x)=\lim\limits_{n\to\infty} f_n(x)=|x|$$
and we can use the same reasoning to show that the derivative of $f_n(x)-f(x)$ is larger than $0$ for $x<0$.
Thus, for all $x$ we have $0\leq f_n(x)-f(x)\leq \frac{1}{n}$, and $f_n$ converges uniformly to $f$ on all of $\mathbb{R}$.
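This claim can also be checked numerically (again my own sketch, not part of the book's solution): on a symmetric grid standing in for $\mathbb{R}$, with $f(x)=|x|$, the largest gap should occur at $x=0$ and equal exactly $\frac{1}{n}$:

```python
import math

# On all of R the pointwise limit is f(x) = |x|; the supremum of
# f_n(x) - f(x) should be attained at x = 0 and equal 1/n.
xs = [k * 0.001 for k in range(-100_000, 100_001)]   # grid standing in for R

for n in (1, 2, 5, 10):
    sup_gap = max(math.sqrt(x**2 + 1/n**2) - abs(x) for x in xs)
    assert abs(sup_gap - 1/n) < 1e-12
    print(f"n={n:2d}  sup gap = {sup_gap:.6f}  1/n = {1/n:.6f}")
```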


Your idea is correct. You ended up showing that $$ \sup_{x\in \mathbb{R}}|f(x)-f_n(x)| = |f(0)-f_n(0)| = \frac 1n \to 0 \quad(n\to \infty). $$
This shows that $f_n \to f$ uniformly on $\mathbb{R}$.