Baby Rudin Ch. 5 Exercise 2: How does differentiability imply continuity at endpoints?


(Baby Rudin Chapter 5 Exercise 2)

Suppose $f'(x) > 0$ in $(a, b)$. Prove that $f$ is strictly increasing in $(a, b)$, and let $g$ be its inverse function.

Prove that $g$ is differentiable, and that $g'(f(x)) = \frac{1}{f'(x)} \; \; \; (a < x < b)$.

I attempted the proof as follows:

Suppose $f'(x)>0$ in $(a, b)$ for some $x \in (a, b)$. Then, $f$ is continuous on $\color{blue}{[a, b]}$ and by the Mean Value Theorem, $\frac{f(b)-f(a)}{b-a}>0 \implies f(b)-f(a)>0$ and we have that $f$ is strictly increasing in $(a, b)$.

Now, let $g = f^{-1}(x), t\in (a, b)$ such that $t\ne x$ and $\epsilon>0$. Put $\delta = \epsilon |t-a|>0$. Then, if $0< |t-x|< \delta$, then \begin{align*} 0< |t-x|< \epsilon |t-a| &\implies |f^{-1}(f(t))-f^{-1}(f(x))|< \epsilon |t-a| \\ &\implies \left|\frac{f^{-1}(f(t))-f^{-1}(f(x))}{t-x}\right|< \left|\frac{f^{-1}(f(t))-f^{-1}(f(x))}{t-a}\right|< \epsilon \\ &\implies \left|\frac{g(f(t))-g(f(x))}{t-x}\right|< \epsilon \end{align*} and we have that $g$ is differentiable.

Next, put $h(t) = g(f(t)) = f^{-1}(f(t)) = t$. Note that we satisfy the hypothesis of the Theorem on the chain rule of differentiation; by its consequent, we have \begin{equation*} g'(f(x)) = \frac{h'(x)}{f'(x)} = \frac{1}{f'(x)} \end{equation*} and we are done.

My question: On second reading, I realized that it is not correct to say that $f$ is continuous on $[a, b]$ and that I could only conclude that $f$ is continuous on $(a, b)$ based on the fact that $f$ is differentiable on $(a, b)$. Is this true? If so, is there a way to prove that $f$ is continuous on $[a, b]$, or do I need to completely overhaul that part of the proof? Also, is the rest of my proof correct?




To prove: $f'(x)>0$ on $(a,b)$ $\implies$ $f$ is strictly increasing (in particular, monotonic) on $(a,b)$.

The argument is to prove $f(x_{i})<f(x_{j})$ for every pair $x_{i}<x_{j}$ in the interval $(a,b)$. For this, take the closed interval $[x_{i},x_{j}]$ and apply the Mean Value Theorem to get the inequality. Hence you do not need continuity on $[a,b]$, only on $(a,b)$, which follows from differentiability on $(a,b)$.
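Spelled out, the Mean Value Theorem step looks like this: for any $a < x_{i} < x_{j} < b$, $f$ is continuous on $[x_{i}, x_{j}]$ and differentiable on $(x_{i}, x_{j})$, so there is some $c \in (x_{i}, x_{j})$ with $$f(x_{j}) - f(x_{i}) = f'(c)\,(x_{j} - x_{i}) > 0,$$ since $f'(c) > 0$ and $x_{j} - x_{i} > 0$.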


For the second part:

The derivative of $g$ at $y_{0}$ is $$\lim_{y\to y_{0}}{g(y)-g(y_{0})\over y-y_{0}},$$ where $y_{0} = f(x_{0})$ and $x_{0}$ is a point in the interval $(a,b)$ (the domain of $f$). The limit is taken over points $y$ in the range of $f$, since $g$ takes as inputs the outputs of $f$.

Since $f(g(y))=y$, the derivative is equal to $$\lim_{y\to y_{0}}\dfrac{1}{ \dfrac{f(g(y))-f(g(y_{0}))}{g(y)-g(y_{0})}}$$ (the inner denominator $g(y)-g(y_{0})$ cannot be zero for $y \ne y_{0}$, since $f$ and $g$ are one-to-one functions).

Writing $x = g(y)$, the denominator is equal to $\dfrac{f(x)-f(x_{0})}{x-x_{0}}$, and hence

the limit of the denominator is $f'(x_{0})$ (since $y\to y_{0} \implies x\to x_{0}$, because $g$ is continuous: the inverse of a continuous monotonic function on an interval is continuous). Hence $g'(y_{0})=1/f'(x_{0})$ by the reciprocal law of limits.
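Putting the steps together, with $x = g(y)$ and $x_{0} = g(y_{0})$: $$g'(y_{0}) = \lim_{y\to y_{0}}\frac{g(y)-g(y_{0})}{y-y_{0}} = \lim_{x\to x_{0}}\frac{x-x_{0}}{f(x)-f(x_{0})} = \frac{1}{f'(x_{0})}.$$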

Hence the limit exists and its value is $1/f'(x_{0})$, as required.
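As a quick numerical sanity check (not a substitute for the proof), here is a sketch in Python for the concrete choice $f(x)=x^3$ on $(0,2)$, whose inverse is $g(y)=y^{1/3}$; the function names and the step size `h` are illustrative choices, not part of the exercise:

```python
# Sanity check of g'(f(x0)) = 1/f'(x0) for f(x) = x**3 on (0, 2),
# where f'(x) = 3x^2 > 0, so f is strictly increasing with inverse
# g(y) = y**(1/3).

def f(x):
    return x ** 3

def fprime(x):
    return 3 * x ** 2

def g(y):
    return y ** (1.0 / 3.0)

def g_prime_numeric(y0, h=1e-6):
    # Central-difference approximation of g'(y0).
    return (g(y0 + h) - g(y0 - h)) / (2 * h)

x0 = 1.5
y0 = f(x0)
lhs = g_prime_numeric(y0)   # numerical estimate of g'(f(x0))
rhs = 1.0 / fprime(x0)      # 1/f'(x0) from the theorem
print(lhs, rhs)             # the two values should agree closely
```

Here $f'(1.5) = 6.75$, so both sides should be close to $1/6.75 \approx 0.1481$; the agreement illustrates the formula but, of course, proves nothing.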