While looking over true/false questions from previous midterms, one of my conscientious students claimed that the statement
"If $f$ is defined on an open interval containing $c$, $f'(c)=0$, and $f''(c)>0$, then $c$ is a local minimum of $f$"
was false, because one of the hypotheses for the second derivative test (at least in Stewart) is that the second derivative is continuous in a neighborhood of $c$.
Can anyone think of a counterexample for this statement (in one real variable)? It's apparently been too long since I've taken an analysis class to come up with something clever.
As pointed out in the comments, the mere existence of $f''(c)$ is enough: $f'(c)=0$ together with $f''(c)>0$ already forces a local minimum, so continuity of the second derivative is not required in the one-variable case.
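For completeness, here is a sketch of the standard argument, using nothing beyond the definition of $f''(c)$ (note that the existence of $f''(c)$ already forces $f'$ to exist on a neighborhood of $c$):
$$f''(c) = \lim_{h \to 0} \frac{f'(c+h) - f'(c)}{h} = \lim_{h \to 0} \frac{f'(c+h)}{h} > 0,$$
so there is a $\delta > 0$ with $f'(c+h)/h > 0$ whenever $0 < |h| < \delta$; that is, $f' < 0$ on $(c-\delta, c)$ and $f' > 0$ on $(c, c+\delta)$. By the mean value theorem, $f(x) > f(c)$ for all $x$ with $0 < |x - c| < \delta$, so $c$ is a local minimum.

In particular, no counterexample exists. For contrast, a standard example such as $f(x) = x^2 + x^4 \sin(1/x)$ (with $f(0) = 0$) satisfies $f'(0) = 0$ and $f''(0) = 2 > 0$ even though $f''$ is discontinuous at $0$; the test still correctly identifies $0$ as a local minimum, exactly as the argument above guarantees.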