Here is the exercise:
Given $f: [0, +\infty) \rightarrow \mathbb{R}$, $f$ continuous.
Prove that if $\lim_{x \rightarrow \infty} f(x) = \lambda$, where $\lambda \in \mathbb{R}$, then $f$ is uniformly continuous.
My solution:
I know that for every $\epsilon > 0$ there is an $M > 0$ such that $x > M \implies |f(x) - \lambda| < \epsilon/2$.
Now any $x_1, x_2 \in [0, +\infty)$ can lie either both in $[0,M]$ or both in $(M, \infty)$ or one in each of the aforementioned intervals.
In the first case we can invoke the Heine–Cantor theorem, which shows that $f$ is uniformly continuous on $[0,M]$.
In the second case we have that $|f(x_1) - \lambda| < \epsilon/2$ and $|f(x_2) - \lambda| < \epsilon/2$, which implies that $|f(x_1) - f(x_2)| < \epsilon$ by the triangle inequality.
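Spelling out the triangle-inequality step in the second case:
$$|f(x_1) - f(x_2)| \le |f(x_1) - \lambda| + |\lambda - f(x_2)| < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon.$$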
In addition, we can reduce the third case to the first case by choosing $M_1 = \max \{x_1, x_2\} + 1$, so that both points lie in $[0, M_1]$; since $M_1 > M$, for all $x > M_1$ we still have the implication needed for the second case, i.e., that $|f(x) - \lambda| < \epsilon/2$.
I have a different solution to the third case but was wondering if this one would suffice. In any case, is my solution correct? Are there other ways of doing this? Are there points where I need to be more precise?
PS: I had a doubt during the exercise: I never explicitly found the $\delta > 0$ such that $|x_1 - x_2| < \delta \implies |f(x_1) - f(x_2)| < \epsilon$. I think in the second case any $\delta$ would do, correct?
An easy way to do this is to circumvent the problem that $x_1 \in [0,M]$ and $x_2 \in (M,\infty)$ by choosing a $\delta$ that traps both points in one of the sets. To do this, we need to make the sets overlap a bit, but that is no problem.
Let $\varepsilon > 0$ and choose $M > 0$ such that for $x > M$ we have $$\lvert f(x) - \lambda \rvert < \varepsilon / 2.$$ Since $f$ is continuous and $[0,M+1]$ is compact, we know that $f$ is uniformly continuous there. Hence there is $\delta_1 > 0$ such that for all $x_1, x_2 \in [0,M+1]$ with $\lvert x_1 - x_2 \rvert < \delta_1$ we have $$\lvert f(x_1) - f(x_2) \rvert < \varepsilon.$$ Also, for any $x_1, x_2 \in (M, \infty)$ we have $$\lvert f(x_1) - f(x_2) \rvert \le \lvert f(x_1) - \lambda \rvert + \lvert f(x_2) - \lambda \rvert < \varepsilon /2 + \varepsilon/2 = \varepsilon.$$

Choose $\delta = \min \{\delta_1, \tfrac 1 2 \}$. Then for any $x_1, x_2 \in [0,\infty)$ such that $\lvert x_1 - x_2 \rvert < \delta$, we must have that $$x_1,x_2 \in [0,M+1] \qquad \text{ or } \qquad x_1,x_2 \in (M,\infty).$$ Indeed, both may hold at once, but the point is that we have eliminated the case where $x_1 \in [0,M]$ and $x_2 \in (M,\infty)$. In either case, as we have already noted, $\lvert f(x_1) - f(x_2) \rvert < \varepsilon$. Since this $\delta$ depends only on $\varepsilon$ (not on $x_1$ or $x_2$), this shows that $f$ is uniformly continuous.
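As a quick numerical sanity check of this construction (my own illustration, not part of the question), take the concrete function $f(x) = \arctan(x)$, which tends to $\lambda = \pi/2$. The sketch below picks $M$ exactly as in the proof and then probes many pairs of points closer than $\delta$ to confirm the uniform bound:

```python
import math

# Concrete example (an assumption for illustration): f(x) = arctan(x),
# which is continuous on [0, infinity) and tends to lambda = pi/2.
f = math.atan
eps = 0.1

# As in the proof: choose M with |f(x) - lambda| < eps/2 for all x > M.
# For arctan, |atan(x) - pi/2| < eps/2 holds iff x > 1/tan(eps/2).
M = 1 / math.tan(eps / 2)

# On the compact set [0, M+1], arctan is 1-Lipschitz, so delta_1 = eps
# works there; the proof then takes delta = min(delta_1, 1/2).
delta = min(eps, 0.5)

# Probe pairs |x1 - x2| < delta on a grid covering [0, 3M], which
# straddles the boundary between [0, M+1] and (M, infinity).
xs = [i * 0.01 for i in range(int(3 * M / 0.01))]
worst = max(abs(f(x) - f(x + 0.9 * delta)) for x in xs)
print(worst < eps)  # the uniform bound holds on every sampled pair
```

This only checks finitely many pairs, of course, but it illustrates how the single $\delta$ works on both pieces of the domain at once.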