I understand this question in theory, and we've covered the basics of $\delta$-$\epsilon$ proofs, but I really have no idea where to start, so please don't ask what I've got so far: I've got nothing. This is from a question sheet handed out in my real analysis course; no answers are provided, so I have no way to get hints or check my work.
Suppose $f: \mathbb{R} \rightarrow \mathbb{R}$ is $C^2$ everywhere, and suppose that $c \in \mathbb{R}$ satisfies $f'(c) = 0$ and $f''(c) > 0$.

a) Prove that there exists $\delta > 0$ such that $|x - c| < \delta \implies f''(x) > 0$.

b) Prove that $f'$ is increasing on $[c - \delta, c + \delta]$, and deduce that $f'(x) > 0$ for $x \in (c, c + \delta)$.

c) Prove that $c$ is a local minimum for $f$.
A useful result: if $f$ is continuous at $x_0$ and $f(x_0) > 0$, then there exists $\delta > 0$ such that $f(x) > 0$ for all $x \in (x_0-\delta, x_0+\delta)$.
Proof : Since $f$ is continuous at $x_0$, for $\varepsilon = f(x_0)/2$ there is a $\delta>0$ such that $|f(x)-f(x_0)| < \varepsilon$ for all $x \in (x_0-\delta,x_0+\delta)$. But this means that, for all $x \in (x_0-\delta,x_0+\delta)$, we have $$f(x)-f(x_0)>-f(x_0)/2,$$ and then $f(x)>f(x_0)/2>0$.
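In case it helps to see the connection, here is how I read the lemma as applying to part a), sketched in LaTeX (substituting $f''$ for the $f$ in the lemma, which is my own interpretation, not something stated on the sheet):

```latex
% Apply the useful result with f'' in place of f:
% f'' is continuous (since f is C^2) and f''(c) > 0,
% so the lemma yields a \delta > 0 with
\[
  |x - c| < \delta \;\implies\; f''(x) > \frac{f''(c)}{2} > 0,
\]
% which is exactly the conclusion asked for in part a).
```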