If $f$ is continuous, $f'(c)=0$ and $f''(c)<0$, does that imply that there exists some interval around $c$ such that $f(x)<f(c)$ for all $x$ in the interval?
I've discussed this on reddit and I think the answer is no. It would be true if $f''$ were continuous at $c$. However, I cannot seem to find an explicit function as a counterexample. Can anyone think of one?
EDIT: In fact the implication is true without continuity of $f''$ at $c$, as answered below.
Also, can anyone confirm that concavity at a single point $c$ does not make sense, since concavity/convexity is only defined over an interval? I believe I got a wrong impression from school, or a book, or something.
We assume that $f$ is twice differentiable in $(a,b)$ with $c\in (a,b)$ (no need for continuity of $f''$). By the definition of the second derivative, and since $f'(c)=0$, $$0>f''(c)=\lim_{x\to c}\frac{f'(x)-f'(c)}{x-c}=\lim_{x\to c}\frac{f'(x)}{x-c},$$ which implies, by the definition of the limit, that there is $\delta>0$ such that $$\forall x\in(c-\delta,c)\cup(c,c+\delta),\quad\frac{f'(x)}{x-c}<0,$$ that is, $f'(x)>0$ on $(c-\delta,c)$ and $f'(x)<0$ on $(c,c+\delta)$. Now note that, by the Mean Value Theorem, if $x\in(c-\delta,c)\cup(c,c+\delta)$ then for some $t$ strictly between $x$ and $c$ $$f(x)-f(c)=f'(t)(x-c)<0,$$ since $t$ lies on the same side of $c$ as $x$, so $f'(t)$ and $x-c$ have opposite signs in either case. That is, $c$ is a strict local maximum point of $f$ in $(a,b)$.
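For anyone who wants a concrete sanity check of this argument, here is a short Python sketch. The test function $f(x)=-x^2-x^4\sin^2(1/x)$ with $f(0)=0$ is my own illustrative choice (not from the question): it satisfies $f'(0)=0$ and $f''(0)=-2<0$ while $f''$ oscillates near $0$ and so is discontinuous there, i.e. it meets exactly the hypotheses above and nothing more. Numerically, $0$ is indeed a strict local maximum.

```python
import numpy as np

# Illustrative example (my own choice, not from the question):
# f(x) = -x^2 - x^4 * sin^2(1/x) for x != 0, and f(0) = 0.
# By hand: f'(0) = 0 and f''(0) = -2 < 0, while f'' is discontinuous at 0.
def f(x):
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    nz = x != 0.0
    out[nz] = -x[nz] ** 2 - x[nz] ** 4 * np.sin(1.0 / x[nz]) ** 2
    return out

# Strict local maximum check: f(x) < f(0) = 0 on a punctured interval around 0.
delta = 1e-2
xs = np.linspace(-delta, delta, 200001)
xs = xs[xs != 0.0]
print(np.all(f(xs) < 0.0))  # expected: True

# Second derivative at the point: since f'(0) = 0, the quotient f'(x)/x should
# approach f''(0) = -2 as x -> 0.  f'(x) is approximated by a central difference.
def fprime(x, h=1e-7):
    return (f(x + h) - f(x - h)) / (2.0 * h)

for x0 in (1e-3, -1e-3, 1e-4, -1e-4):
    print(x0, fprime(np.array([x0]))[0] / x0)  # expected: values near -2
```

This is only a numerical illustration under the stated assumptions, of course; the proof above is what actually establishes the result.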