Consider the function $f$ below:
$f(x) = -\log x, \qquad x > 0$
Its second derivative is positive for all $x > 0$, therefore it is convex on its domain:
$\Large \frac{d^2f}{dx^2} = \frac{d}{dx}\left(-\frac{1}{x}\right) = \frac{1}{x^2} > 0, \quad \forall x > 0$
However, setting its first derivative to zero gives $-\frac{1}{x} = 0$, which has no solution: the derivative only approaches $0$ as $x \to \infty$, so the minimum seems to be "attained at infinity". Can we still say that the function has a minimum? The negative log is a very popular loss function in machine learning, so it is curious that practitioners minimize a function whose minimum is literally unattainable.
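A quick numerical check makes the observation concrete (a minimal sketch; NumPy and the particular sample points are just for illustration):

```python
import numpy as np

# f(x) = -log x sampled at geometrically growing points:
# the values decrease without bound, while the derivative
# f'(x) = -1/x approaches 0 but never equals it.
x = np.logspace(0, 12, 5)   # 1, 1e3, 1e6, 1e9, 1e12
f = -np.log(x)
df = -1.0 / x

for xi, fi, di in zip(x, f, df):
    print(f"x = {xi:10.3e}   f(x) = {fi:9.3f}   f'(x) = {di:10.3e}")
```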
It seems that you have misunderstood the following principle: for a differentiable convex function $f$, a point $x^*$ is a global minimizer if and only if $f'(x^*) = 0$.
The reason it doesn't apply here is that $f'(x) = -\frac{1}{x}$ is never equal to $0$, so no candidate minimizer exists. In fact $f'(x) < 0$ for all $x > 0$, so $f$ is strictly decreasing and has no minimum value.
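Spelling the argument out (a short derivation; nothing here beyond single-variable calculus):

$$f'(x) = -\frac{1}{x} < 0 \quad \text{for all } x > 0,$$

so $f$ is strictly decreasing on its domain, and

$$\inf_{x > 0} f(x) = \lim_{x \to \infty} \bigl(-\log x\bigr) = -\infty,$$

so the infimum is not attained at any finite $x$. Convexity guarantees that any stationary point is a global minimizer, but it does not guarantee that a stationary point exists. As for the machine-learning remark: in practice the argument of $-\log$ is typically a probability $p \in (0, 1]$, so the loss $-\log p$ is bounded below by $0$ and attains its minimum at $p = 1$; the unbounded behavior only arises when the argument can grow without bound.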