Consider the following standard real analysis textbook theorem:
Let $I$ be a perfect (i.e. non-degenerate) interval and $f\colon I \to \mathbb{R}$ be $C^3$ (i.e. three times differentiable with $f'''$ continuous).
If $x_0 \in \mathring{I}$ with $f'(x_0) = f''(x_0) = 0$ and $f'''(x_0) \neq 0$, then $x_0$ is not a local extremum point of $f$.
Is this theorem still true if:
- $f'''$ is continuous only at $x_0$,
- $f'''$ is not continuous at $x_0$?
If so, how can it be proved; if not, what would be a counterexample?
**Edit**
I guess that one doesn't need the continuity. Suppose $f'''(x_0) > 0$. Then by the second derivative test (which doesn't need continuity), applied to $f'$, the point $x_0$ is a strict local minimum of $f'$. Since also $f'(x_0) = 0$, there is a neighborhood $U(x_0) \subseteq I$ of $x_0$ such that $f'(x) > 0$ for all $x \in U(x_0)\setminus\{x_0\}$. This implies that $f$ is strictly monotonically increasing both on $I_L := \left]-\infty;x_0\right] \cap U(x_0)$ and on $I_R := U(x_0) \cap \left[x_0;\infty\right[$. This means that $f(x) < f(x_0)$ for all $x \in I_L\setminus\{x_0\}$ and that $f(x) > f(x_0)$ for all $x \in I_R\setminus\{x_0\}$, which in turn implies that $x_0$ is not a local extremum of $f$.
However, I am not sure whether this argument is correct, in particular because I didn't find any reference that proves the theorem in this way. The standard approach seems to use Taylor's theorem.
Actually, it is sufficient to have $f\in C^2$, whereas $f'''$ need exist only at $x_0$ (and of course we still require $f'(x_0)=f''(x_0)=0$ and $a:=f'''(x_0)\ne 0$). Let's show this in detail for the case $a=f'''(x_0)>0$, say (though you essentially already did). From the definition of $\lim_{h\to0}\frac{f''(x_0+h)-f''(x_0)}{h}=a>0$ we get that for a suitable $\delta >0$ we have $\left|\frac{f''(x_0+h)-f''(x_0)}{h}-a\right|<\frac a2$ for all $h\ne 0$ with $|h|<\delta$. Since $f''(x_0)=0$, this gives $\frac{f''(x_0+h)}h>\frac a2$, i.e., $f''(x_0+h)>\frac {ah}2>0$ for $0<h<\delta$ and $f''(x_0+h)<\frac{ah}2<0$ for $-\delta<h<0$. By integrating (here we use that $f''$ is continuous), $$f'(x_0+h)=\int_0^h f''(x_0+x)\,\mathrm dx>\int_0^h\frac{ax}2\,\mathrm dx=\frac{ah^2}{4}>0 $$ for $0<h<\delta$, and $$f'(x_0+h)=-\int_h^0 f''(x_0+x)\,\mathrm dx>-\int_h^0\frac{ax}2\,\mathrm dx=\frac{ah^2}{4}>0 $$ for $-\delta<h<0$. Integrating again, $$ f(x_0+h)-f(x_0)=\int_0^h f'(x_0+x)\,\mathrm dx>\int_0^h\frac{ax^2}4\,\mathrm dx=\frac{ah^3}{12}>0$$ for $0<h<\delta$, and $$ f(x_0)-f(x_0+h)=\int_h^0 f'(x_0+x)\,\mathrm dx>\int_h^0\frac{ax^2}4\,\mathrm dx=-\frac{ah^3}{12}>0$$ for $-\delta<h<0$. Hence there is no neighbourhood $U$ of $x_0$ such that $f(x_0)\ge f(x)$ for all $x\in U$, nor one such that $f(x_0)\le f(x)$ for all $x\in U$. In other words, $x_0$ is not a local extremum.
In order to see that such a beast really exists, start with your favourite continuous, nowhere differentiable function $w(x)$ and make sure that $f''(x)=x^2w(x)+x$, i.e., let $$f(x)=\int_0^x\int_0^u\bigl(t^2w(t)+t\bigr)\,\mathrm dt\,\mathrm du $$ (and of course pick your $I$ such that $0$ is an interior point). Then $f\in C^2$, $f'(0)=f''(0)=0$, and $\frac{f''(h)-f''(0)}{h}=hw(h)+1\to 1$ as $h\to 0$, so $f'''(0)=1$, while $f'''$ exists at no other point.