This is the context in which my question lies; see below for the actual question. Let $f(x)$ be differentiable everywhere and have a local minimum at $x^*$. Then, by the definition of the derivative, for every $\epsilon > 0$ there is a neighbourhood of $x^*$ such that, for every $x \neq x^*$ in it:
$$ \left| \frac{f(x)-f(x^*)}{x-x^*} -f'(x^*) \right| < \epsilon $$
If $f'(x^*) \neq 0$, we may set $\epsilon = \frac{1}{2} \left| f'(x^*) \right| $ and so:
\begin{align} &\left| \frac{f(x)-f(x^*)}{x-x^*} -f'(x^*) \right| < \frac{1}{2} \left| f'(x^*) \right| \notag\\ & f'(x^*)-\frac{1}{2}\left| f'(x^*) \right| < \frac{f(x)-f(x^*)}{x-x^*} < f'(x^*)+\frac{1}{2}\left| f'(x^*)\right| \tag{*}\label{*} \end{align}
If we assume $f'(x^*)>0$, the first inequality in the last line of \eqref{*} gives:
$$\frac{f(x)-f(x^*)}{x-x^*} > f'(x^*)-\frac{1}{2} f'(x^*) = \frac{1}{2} f'(x^*) > 0$$
which, for $x<x^*$ (so $x-x^*<0$), gives $f(x)-f(x^*)<0$, i.e. $f(x)<f(x^*)$, contradicting the assumption that $x^*$ is a minimum.
If we assume $f'(x^*)<0$, so that $\left|f'(x^*)\right| = -f'(x^*)$, the second inequality in the last line of \eqref{*} gives:
$$ \frac{f(x)-f(x^*)}{x-x^*} < f'(x^*) - \frac{1}{2} f'(x^*) = \frac{1}{2} f'(x^*) < 0 $$
which, for $x>x^*$ (so $x-x^*>0$), gives $f(x)<f(x^*)$, again contradicting the assumption that $x^*$ is a minimum.
Therefore, for $x^*$ to be a minimum it is necessary that $f'(x^*)=0$ (and one can prove the same for a maximum by symmetric reasoning).
I think that, given the assumptions on $f$, I could rework these lines to prove that a local extremum is unique in a proper neighbourhood when the condition $f'(x^*)=0$ is met. Is it possible to deal with uniqueness along these lines, without invoking other theorems? In particular, if the statement is false, is there a related, true statement?
I think you are mistaken. A vanishing derivative only tells you that you have found a stationary point. Just consider $f(x)=x^3$ at $x=0$.
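To spell the counterexample out in the notation above:

```latex
\begin{align*}
f(x) &= x^3, & f'(x) &= 3x^2, & f'(0) &= 0,
\end{align*}
% yet $f(x) < f(0)$ for all $x < 0$ and $f(x) > f(0)$ for all $x > 0$,
% so $f$ attains neither a minimum nor a maximum at the stationary
% point $x^* = 0$ (it is an inflection point).
```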
If $f(x)$ is convex, then $f'(x^*)=0$ implies a local minimum at $x^*$, but not its uniqueness: think of $f(x)=c$ with $c$ any constant, where every point is a (non-strict) minimum.
For a stationary point to be a strict (and hence locally unique) minimum, a sufficient condition is positive definiteness of the Hessian matrix, or, in one dimension, a strictly positive second derivative.
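A short sketch of why $f'(x^*)=0$ together with $f''(x^*)>0$ forces a strict minimum, via the second-order Taylor expansion (assuming, in addition to the hypotheses above, that $f$ is twice differentiable at $x^*$):

```latex
\begin{align*}
f(x) &= f(x^*) + f'(x^*)(x-x^*)
        + \tfrac{1}{2} f''(x^*)(x-x^*)^2 + o\!\left((x-x^*)^2\right) \\
     &= f(x^*) + \tfrac{1}{2} f''(x^*)(x-x^*)^2 + o\!\left((x-x^*)^2\right)
     \; > \; f(x^*)
\end{align*}
% for all $x \neq x^*$ close enough to $x^*$, since the quadratic term
% dominates the $o$-term there; hence $x^*$ is the unique minimiser
% in some neighbourhood.
```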