First, let me state the delta method as given on Wikipedia:
If there is a sequence of random variables $(X_n)$ satisfying
$\sqrt{n}[X_n-\theta]\,\xrightarrow{D}\,\mathcal{N}(0,\sigma^2)$ where $\theta$ and $\sigma^2$ are finite-valued constants, then
$\sqrt{n}[g(X_n)-g(\theta)]\,\xrightarrow{D}\,\mathcal{N}(0,\sigma^2\cdot[g'(\theta)]^2)$
for any function $g$ such that $g'(\theta)$ exists and is non-zero.
In almost every statement of this theorem that I could find, the derivative is required to be nonzero. However, as far as I know, the mean value theorem is used in the proof, so the argument should go through even when the derivative is zero.
My question is: what part of the proof could go wrong if the derivative is zero? If the result holds even in the zero-derivative case (where the asymptotic distribution degenerates to a point mass at 0), why do people specify that $g'(\theta)\neq 0$?
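To make the degenerate case concrete, here is a quick Monte Carlo sketch of my own (the choice $g(x)=x^2$ at $\theta=0$, the sample sizes, and the direct sampling of $\bar X_n$ are all my own illustration, not part of any reference): with $g'(\theta)=0$, the scaled difference $\sqrt{n}[g(X_n)-g(\theta)]$ visibly collapses toward 0 as $n$ grows.

```python
import numpy as np

# Zero-derivative case: g(x) = x**2 at theta = 0, so g'(theta) = 0.
# If X_n is the mean of n iid N(0, 1) draws, then X_n ~ N(0, 1/n) exactly,
# so we can sample X_n directly instead of averaging n draws.
rng = np.random.default_rng(0)
theta, reps = 0.0, 200_000

def g(x):
    return x ** 2

stds = []
for n in (100, 10_000, 1_000_000):
    xbar = rng.normal(theta, 1.0 / np.sqrt(n), size=reps)
    scaled = np.sqrt(n) * (g(xbar) - g(theta))  # the delta-method scaling
    stds.append(scaled.std())
    print(f"n = {n:>9}: sd of sqrt(n)[g(X_n) - g(theta)] = {scaled.std():.4f}")

# The standard deviation shrinks like 1/sqrt(n): the limit is a point mass
# at 0, which is consistent with sigma^2 * [g'(theta)]^2 = 0.  To get a
# nondegenerate limit one needs the second-order scaling n[g(X_n) - g(theta)],
# which converges to a (scaled) chi-squared distribution.
```

So the conclusion of the theorem is, at least formally, still true with $g'(\theta)=0$; the limit is just degenerate rather than a nontrivial normal law.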
Thank you!