Continuity of $f$ doesn't imply differentiability


I am currently working with Stewart's book "Calculus". I found the following proof of the

Theorem: If $f$ is differentiable at $a$, then $f$ is continuous at $a$.

Proof: By assumption

$$f'(a) = \lim_{x \to a}\frac{f(x) - f(a)}{x -a}$$

exists. Thus we can write

$$\begin{align} \lim_{x \to a} \left[ f(x) - f(a) \right] & = \lim_{x \to a}\frac{f(x) - f(a)}{x -a} \left( x-a \right) \\ & = \lim_{x \to a}\frac{f(x) - f(a)}{x -a} \cdot \lim_{x \to a} \left( x-a \right) \\ & = f'(a) \cdot 0 = 0\end{align}$$

Now we have

$$\begin{align} \lim_{x \to a}f(x) & = \lim_{x \to a} \left[ f(a) + \left( f(x) - f(a) \right) \right] \\ & = \lim_{x \to a} f(a) + \lim_{x \to a} \left[\left( f(x) - f(a) \right) \right] \\ & = f(a) + 0 = f(a)\end{align}$$

Therefore $f(x)$ is continuous at $a$. $\blacksquare$
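The converse fails, and the standard counterexample is $f(x) = |x|$, which is continuous everywhere but not differentiable at $a = 0$. A small numerical sketch of this (my own illustration, not part of Stewart's text):

```python
# Sketch: f(x) = |x| is continuous at a = 0 but not differentiable there.
def f(x):
    return abs(x)

a = 0.0

# Continuity: f(x) approaches f(a) as x -> a from both sides.
for h in [0.1, 0.01, 0.001]:
    assert abs(f(a + h) - f(a)) < 2 * h
    assert abs(f(a - h) - f(a)) < 2 * h

# Difference quotients from the right stay at +1, from the left at -1,
# so the two-sided limit defining f'(0) does not exist.
right = [(f(a + h) - f(a)) / h for h in [0.1, 0.01, 0.001]]
left = [(f(a - h) - f(a)) / (-h) for h in [0.1, 0.01, 0.001]]
print(right)  # [1.0, 1.0, 1.0]
print(left)   # [-1.0, -1.0, -1.0]
```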

Now I came upon the idea of working this proof backwards and trying to see where it fails. It should fail, since continuity of $f$ doesn't imply differentiability. I post my train of thought here and ask whether I am correct.

From $\lim_{x \to a}f(x) = f(a)$, which is given by continuity of $f$, we can deduce that $\lim_{x \to a} f(a) = f(a)$. This is intuitively correct. Using a delta-epsilon argument, working from

$$ 0 < |x-a| < \delta \implies |f(x) - f(a)| < \epsilon $$

we have

$$ 0 < |x-a| < \delta \implies |f(a) - f(a)| < \epsilon $$

which is true for all $\epsilon > 0$, regardless of $\delta$.

Now we can use the same intermediate step as above:

$$\begin{align} \lim_{x \to a} \left[ f(x) - f(a) \right] & = \lim_{x \to a}\frac{f(x) - f(a)}{x -a} \left( x-a \right) \\ & = \lim_{x \to a}\frac{f(x) - f(a)}{x -a} \cdot \lim_{x \to a} \left( x-a \right) \\ & = \lim_{x \to a}\frac{f(x) - f(a)}{x -a} \cdot 0 = 0\end{align}$$

Now, my first impulse was to say that $\lim_{x \to a} \left[ f(x) - f(a) \right] = \lim_{x \to a}\frac{f(x) - f(a)}{x -a} \cdot 0$ is true for all values of $\lim_{x \to a}\frac{f(x) - f(a)}{x -a}$, so that we cannot make any statement about its value. But I think this is not correct, since if $\lim_{x \to a}\frac{f(x) - f(a)}{x -a}$ doesn't exist, the whole equation is not defined. So I think it is better to say that from here we can't go any further, since $\lim_{x \to a}\frac{f(x) - f(a)}{x -a} \cdot 0 = 0$ doesn't give any information on the existence or value of $\lim_{x \to a}\frac{f(x) - f(a)}{x -a}$.
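This point can be made concrete with $f(x) = |x|$ at $a = 0$: the difference quotient equals $\operatorname{sign}(x)$, which has no limit at $0$, and yet its product with $(x - a)$ still tends to $0$. A short numerical check (my own example):

```python
# With f(x) = |x| and a = 0, the difference quotient is sign(x):
# it has no limit at 0, yet quotient(x) * (x - 0) = |x| still tends to 0.
def quotient(x):
    return abs(x) / x  # equals sign(x) for x != 0

for h in [0.1, 0.01, 0.001]:
    # The quotient stays at +1 from the right and -1 from the left ...
    assert quotient(h) == 1.0 and quotient(-h) == -1.0
    # ... but the product with (x - a) shrinks to 0 from both sides.
    assert abs(quotient(h) * h) <= h
    assert abs(quotient(-h) * (-h)) <= h
```

So the product having limit $0$ tells us nothing about whether the quotient itself converges.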

Now I ask in particular about my assumption $\lim_{x \to a} f(a) = f(a)$ and my statement that $\lim_{x \to a}\frac{f(x) - f(a)}{x -a} \cdot 0 = 0$ gives no information on the existence or value of $\lim_{x \to a}\frac{f(x) - f(a)}{x -a}$. Is my terminology correct there? When writing about math, I have found that small imprecisions in terminology can lead to wrong arguments, so I want to be sure here.

Sorry if this seems silly, but the idea caught my attention and I wanted to see how far I could go in working the above proof backwards. Now I need to know whether my reasoning is correct. Thanks in advance.

Best answer:

You are right: from the fact that$$\lim_{x\to a}\frac{f(x)-f(a)}{x-a}(x-a)=0,$$you cannot deduce that$$\lim_{x\to a}\frac{f(x)-f(a)}{x-a}\cdot\lim_{x\to a}(x-a)=0.\tag1\label a$$The equality \eqref{a} assumes that both limits that appear there exist, and you have no reason to assume that.

More generally, the equality$$\lim_{x\to a}\bigl(f(x)g(x)\bigr)=\lim_{x\to a}f(x)\cdot\lim_{x\to a}g(x)$$holds if both limits $\lim_{x\to a}f(x)$ and $\lim_{x\to a}g(x)$ exist, but the existence of $\lim_{x\to a}\bigl(f(x)g(x)\bigr)$ doesn't allow you to deduce that the limits at $a$ of $f$ and $g$ exist.
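A concrete instance of this (my own example, not from the answer above): take $f(x) = g(x) = \operatorname{sign}(x)$ near $a = 0$. Neither factor has a limit at $0$, yet the product is constantly $1$ for $x \neq 0$, so $\lim_{x\to 0} f(x)g(x) = 1$ exists.

```python
# sign(x) has no limit at 0 (it jumps from -1 to +1), but
# sign(x) * sign(x) = 1 for every x != 0, so the limit of the
# product exists even though the limits of the factors do not.
def sign(x):
    return 1.0 if x > 0 else -1.0

for h in [0.1, 0.01, 0.001]:
    assert sign(h) != sign(-h)        # the factor keeps jumping near 0
    assert sign(h) * sign(h) == 1.0   # the product is identically 1
    assert sign(-h) * sign(-h) == 1.0
```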

On the other hand, you need no assumption whatsoever about $f$ (other than the fact that $a$ belongs to its domain) to know that $\lim_{x\to a}f(a)=f(a)$. For any constant function $\lambda$ and any $a\in\Bbb R$, $\lim_{x\to a}\lambda=\lambda$.