Showing that $f$ is a uniformly continuous function on the interval $[0, \infty)$


Suppose that $f$: $[0, \infty) \rightarrow \mathbb{R}$ is continuous on $[0, \infty)$ and differentiable on $(1, \infty)$ with bounded derivative. Show that $f$ is uniformly continuous. (HINT: split $[0, \infty)$ into two pieces, on each of which $f$ is uniformly continuous, then explain why this implies that $f$ is uniformly continuous on the whole $[0, \infty)$.)

Question: I have solved most of the problem, but wanted a little clarification regarding the second part. Specifically, there is the situation where:

$ x \in [0, 1)$ and $y > 1$ (or vice versa). In this scenario I still have to show $|x - y| < \delta$. Given the conditions that I have can I do the following: $$|x - y| \leq |x - 1| + |y - 1| < \frac{\delta}{2} + \frac{\delta}{2} = \delta$$

And if this is allowed, is the reason I should choose $1$ that the two intervals split at that specific point, so that it allows me to relate $x$ and $y$ to each other?

Assuming the previous step is valid, it then follows that:

$$ |f(x) - f(y)| \leq |f(x) - f(1)| + |f(1) - f(y)| < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$$


There are 2 best solutions below

BEST ANSWER

Use the MVT to show $|f(x)-f(y)|\le M|x-y|$ on $(1,\infty)$. Since $f$ is uniformly continuous on $[0,1]$, for any $\epsilon>0$ there exists $\delta>0$ so that $|f(x)-f(y)|<\epsilon$ whenever $|x-y|<\delta$ and $x,y\in[0,1]$. If $x\le 1\le y$, then $|x-1|\le|x-y|$ and $|y-1|\le|x-y|$; since $f$ is continuous at $1$, there is $\eta>0$ so that $|x-y|<\eta$ forces $|x-1|,|y-1|<\eta$ and $$|f(x)-f(y)|\le |f(x)-f(1)|+|f(1)-f(y)|<\epsilon. $$ Now if $|x-y|<\min\{\epsilon/M,\delta,\eta\}$, we get $|f(x)-f(y)|<\epsilon$.
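As a quick numerical sanity check of the Lipschitz bound from the MVT (a sketch, not part of the proof), take the assumed example $f(x)=\sqrt{x}$, which is not from the original post: $|f'(x)| = \tfrac{1}{2\sqrt{x}} \le \tfrac12$ for $x \ge 1$, so $M = \tfrac12$ works on $(1,\infty)$.

```python
import math
import random

# Assumed example function: f(x) = sqrt(x) has |f'(x)| <= 1/2 on (1, inf),
# so the MVT gives |f(x) - f(y)| <= M|x - y| there with M = 1/2.
f, M = math.sqrt, 0.5

random.seed(0)
for _ in range(10_000):
    x = random.uniform(1.0, 100.0)
    y = random.uniform(1.0, 100.0)
    # A tiny tolerance guards against floating-point rounding.
    assert abs(f(x) - f(y)) <= M * abs(x - y) + 1e-12
print("Lipschitz bound holds on all sampled pairs")
```

If the derivative bound $M$ were taken too small (say $M = 0.1$), the assertion would fail for pairs near $x = y = 1$, which is where $|f'|$ attains its supremum on $[1,\infty)$.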

ANSWER

The mean value theorem gives Lipschitz, and therefore uniform, continuity of $f$ on $[1,\infty)$, including at $1$: differentiability is needed only in the interior. Moreover, $f$ is uniformly continuous on $[0,1]$, since every continuous function on a compact set is automatically uniformly continuous.

Now, let $\varepsilon > 0$. Then there exist $\delta_1, \delta_2 > 0$ such that $$ \lvert f(x_i) - f(1) \rvert < \frac{\varepsilon}{2} $$ whenever $\lvert x_i - 1 \rvert < \delta_i$, where $0 \leq x_1 \leq 1 \leq x_2$. Of course, $\delta_i$ does not depend on $x_i$ or on the point $1$, due to the uniform continuity; this is important. Hence, if we choose $\delta = \min\{ \delta_1,\delta_2 \}$, we obtain $$ \lvert f(x_1) - f(x_2) \rvert \leq \lvert f(x_1) - f(1) \rvert + \lvert f(1) - f(x_2) \rvert < \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon$$ since $$\lvert x_i - 1 \rvert \leq \lvert x_1 - x_2 \rvert < \delta \leq \delta_i$$ for $0 \leq x_1 \leq 1 \leq x_2$. For arbitrary $x_1,x_2$ lying entirely in either $[0,1]$ or $[1,\infty)$, we can use the same $\delta$ due to the choices of each $\delta_i$, which proves the statement.
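The key cross-case inequality $\lvert x_i - 1 \rvert \leq \lvert x_1 - x_2 \rvert$ for $x_1 \leq 1 \leq x_2$ can be illustrated numerically; a small sketch, again using the assumed example $f(x)=\sqrt{x}$ (which satisfies the hypotheses: continuous on $[0,\infty)$, bounded derivative on $(1,\infty)$):

```python
import math
import random

# Assumed example: f(x) = sqrt(x). For x1 <= 1 <= x2, both points lie within
# |x1 - x2| of the split point 1, so the triangle inequality through f(1)
# controls |f(x1) - f(x2)|.
f = math.sqrt

random.seed(0)
for _ in range(10_000):
    x1 = random.uniform(0.0, 1.0)  # point in [0, 1]
    x2 = random.uniform(1.0, 2.0)  # point in [1, inf)
    gap = abs(x1 - x2)
    # |x_i - 1| <= |x1 - x2| whenever x1 <= 1 <= x2.
    assert abs(x1 - 1.0) <= gap and abs(x2 - 1.0) <= gap
    # Triangle inequality through the split point (with a rounding tolerance).
    assert abs(f(x1) - f(x2)) <= abs(f(x1) - f(1.0)) + abs(f(1.0) - f(x2)) + 1e-12
print("cross-case bounds hold on all sampled pairs")
```

This is exactly why the split point $1$ is the right intermediate value: any pair straddling it is automatically at least as close to $1$ as the two points are to each other.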