I am calculating a density using Kernel Density Estimation. The resulting density function is a sum of Gaussians with arbitrary $\mu_i$ and $\Sigma_i$:
$$ \operatorname{density}(\mathbf x) = \sum_i \frac{1}{\sqrt{(2\pi)^{k}\,|\boldsymbol{\Sigma}_i|}} \exp\left(-\frac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{\mathrm{T}}\boldsymbol{\Sigma}_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)\right) $$
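For concreteness, the density above can be evaluated directly as a sum over the components; the following is a minimal sketch (the function and argument names `kde_density`, `mus`, `sigmas` are illustrative, not from the question, and the sum is unnormalized exactly as in the formula):

```python
import numpy as np

def kde_density(x, mus, sigmas):
    """Evaluate the sum-of-Gaussians density at a point x.

    x:      (k,) query point
    mus:    list of (k,) mean vectors mu_i
    sigmas: list of (k, k) covariance matrices Sigma_i
    """
    k = x.shape[0]
    total = 0.0
    for mu, sigma in zip(mus, sigmas):
        diff = x - mu
        # Normalizing constant 1 / sqrt((2*pi)^k * |Sigma_i|)
        norm = 1.0 / np.sqrt((2.0 * np.pi) ** k * np.linalg.det(sigma))
        # Quadratic form diff^T Sigma_i^{-1} diff, via a linear solve
        total += norm * np.exp(-0.5 * diff @ np.linalg.solve(sigma, diff))
    return total
```

With a single component ($\mu = 0$, $\Sigma = I$) this reduces to the standard normal density, which gives an easy sanity check.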
For this case, or even more generally:
If I have (such) a multivariate function that is continuous, whose values are bounded (here between $0$ and some real number), and whose gradient norm $\|\nabla f(\mathbf x)\|$ is bounded everywhere: is the Hessian $\nabla^2 f(\mathbf x)$ of the function always invertible?
The answer is negative.
Consider the one-dimensional Gaussian
$$\varphi(x) = \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}x^2}$$
The second derivative (the $1\times 1$ Hessian) is $\varphi''(x) = (x^2 - 1)\,\varphi(x) = \mathrm{He}_2(x)\,\varphi(x)$, where $\mathrm{He}_2$ is the second (probabilist's) Hermite polynomial, which vanishes at $x = \pm 1$.
More intuitively: the Gaussian has inflection points (at $x = \pm 1$), and at an inflection point the second derivative vanishes, so the Hessian is singular there.
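The counterexample is easy to verify numerically. The sketch below evaluates $\varphi''(x) = (x^2-1)\varphi(x)$ at the inflection points and cross-checks it with a central finite difference (the step size `h` is an arbitrary choice for illustration):

```python
import math

def phi(x):
    """Standard normal density phi(x)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def phi_second(x):
    """Closed-form second derivative: phi''(x) = (x^2 - 1) * phi(x)."""
    return (x * x - 1.0) * phi(x)

def phi_second_numeric(x, h=1e-5):
    """Central finite-difference approximation of phi''(x)."""
    return (phi(x + h) - 2.0 * phi(x) + phi(x - h)) / (h * h)

for x in (-1.0, 0.0, 1.0):
    print(f"x = {x:+.1f}: phi'' = {phi_second(x):+.6f}, "
          f"finite diff = {phi_second_numeric(x):+.6f}")
```

At $x = \pm 1$ the second derivative is exactly zero, i.e. the $1\times 1$ Hessian is singular, while at $x = 0$ it equals $-1/\sqrt{2\pi}$.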