A classic example of a function being differentiable everywhere except at one point is the function $f(x) = \sqrt{\lvert x \rvert}$. This function is not differentiable at $x=0$ as is shown in this answer.
I was thinking about what behavior a function must have around $x=0$ for the derivative there not to exist. I believe the requirements are:
- The function needs to be continuous at $x=0$, since otherwise the contrapositive of differentiability $\implies$ continuity would make the statement trivially true.
- The function has to be concave for all $x \neq 0$.
- The function needs to attain its minimum at $x=0$.
To me, this intuitively seems to guarantee the "pointiness" at $x=0$ that makes the function non-differentiable.
Inspired by the above I wanted to generalize the result. I propose the following theorem:
Given a continuous function $f:\mathbb{R} \to \mathbb{R}$, if $f''(x) \le 0$ on some interval $0<|x|< a$, for some $a\in (0,\infty]$, and $f$ attains its minimum at $0$, then $f'(0)$ doesn't exist.
Here is my attempt at proving the statement:
We analyze the limits $\lim_{h \to 0^+}$ and $\lim_{h \to 0^-}$ of the difference quotient separately.
We see that $\lim_{h \to 0^+} \frac{f(0 + h) - f(0)}{h} > 0$, since $h>0$ and $f(h)-f(0)> 0$ because $f$ attains its minimum at $0$.
Similarly, $\lim_{h \to 0^-} \frac{f(0 + h) - f(0)}{h} < 0$, since $h<0$ and $f(h)-f(0)> 0$ because $f$ attains its minimum at $0$.
Since the limit from the right is positive but the limit from the left is negative, the limit doesn't exist. QED.
This attempt troubles me because I didn't use the condition that the function is concave for all $x \neq 0$. If the proof were correct, the theorem would also hold for functions like $x^2$, which also attains its minimum at $x=0$, but this is clearly wrong!
I can't seem to find exactly what's wrong with the argument, but since I never used the concavity hypothesis, I know it must be. Can anyone tell me how I could correct my proof to make it valid?
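A quick numerical sketch (my illustration, not part of the question) makes the gap visible for the $f(x)=x^2$ example mentioned above: every one-sided difference quotient at $0$ is strictly positive, yet the quotients tend to $0$, so positivity of each quotient only yields a limit $\ge 0$, not $> 0$.

```python
# Difference quotients of f(x) = x^2 at 0: each is positive,
# but their limit is 0, so f'(0) = 0 exists after all.

def quotient(f, h):
    """One-sided difference quotient (f(0 + h) - f(0)) / h."""
    return (f(h) - f(0)) / h

f = lambda x: x * x

hs = [10.0 ** (-k) for k in range(1, 8)]
quotients = [quotient(f, h) for h in hs]

# All quotients are strictly positive ...
assert all(q > 0 for q in quotients)
# ... yet they shrink toward 0 as h -> 0+.
assert quotients[-1] < 1e-6
print(quotients)
```

The same check applied to the left-hand quotients gives strictly negative values tending to $0$, so "right limit positive, left limit negative" fails in exactly the same way.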
Edit: The case where $f''(x) =0$ has a family of counterexamples as Theo Bendit pointed out in his answer. However, the question was inspired by functions that have similar behavior to $f(x) = \sqrt{\lvert x \rvert}$, so any suggestions on how to prove the statement with $f''(x) <0$ instead of $f''(x) \le 0$ are greatly appreciated.
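For contrast, the motivating function does behave as the question describes: with $f(x) = \sqrt{\lvert x \rvert}$, the right-hand difference quotient at $0$ is $\sqrt{h}/h = 1/\sqrt{h}$, which grows without bound. A minimal numerical sketch (an illustration, not a proof):

```python
import math

# For f(x) = sqrt(|x|), the right-hand difference quotient at 0 is
# (sqrt(h) - 0) / h = 1 / sqrt(h), which blows up as h -> 0+,
# so f'(0) cannot exist.

f = lambda x: math.sqrt(abs(x))

hs = [10.0 ** (-k) for k in range(1, 8)]
quotients = [(f(h) - f(0)) / h for h in hs]

# The quotients increase without bound as h -> 0+.
assert all(a < b for a, b in zip(quotients, quotients[1:]))
assert quotients[-1] > 1e3  # 1 / sqrt(1e-7) is about 3162
print(quotients)
```

Note also that any constant function satisfies all the hypotheses with $f'' \equiv 0$ yet is differentiable everywhere, which is one reason the $f''(x) \le 0$ version fails.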
Suppose $f:\mathbb R\to \mathbb R$ is continuous, $f(0)$ is the minimum value of $f,$ and $f''(x)<0$ for $x\ne 0.$ We want to show $f'(0)$ does not exist.
Suppose to reach a contradiction that $f'(0)$ exists. Because the minimum value of $f$ is $f(0),$ we have $f\ge f(0)$ everywhere in $[0,\infty).$ We can't have $f=f(0)$ everywhere in $[0,\infty),$ since that would imply $f''\equiv 0$ in $(0,\infty).$ It follows that $f(a)>f(0)$ for some $a>0.$ For this $a$ we have
$$\frac{f(a)-f(0)}{a-0} >0.$$
By the MVT the above equals $f'(c)$ for some $c\in (0,a).$ Because $f''<0$ on $(0,c),$ $f'$ is strictly decreasing on $(0,c).$ Thus $f'(x)>f'(c)$ for $x\in (0,c).$ Using the MVT again, we see $f(x)\ge f(0)+f'(c)x$ throughout $[0,c].$ This shows
$$f'(0)=\lim_{x\to 0^+} \frac{f(x)-f(0)}{x-0} \ge f'(c)>0.$$
Exactly the same kind of argument works to the left of $0$ to show $f'(0)<0.$ But easier than that is to note $f\ge f(0)$ on $(-\infty,0],$ and so all difference quotients to the left are $\le 0.$ It follows that $f'(0)\le 0.$ This is a contradiction since we obtained $f'(0)>0$ above.