How important is the ratio of negative eigenvalues to the convexity of the optimization objective?


I need to minimize $f(x)$, whose Hessian matrix is $H \in \mathbb{R}^{n\times n}$. Suppose $H$ has one positive eigenvalue equal to $10^5$ and the remaining $n-1$ eigenvalues are all $-1$. In that case, can I rely on a gradient-descent optimizer to reach a global minimum of $f(x)$?

To put it differently: what can we say about the convergence point of a convex solver applied to such a problem? Can I claim that the convergence point will be close to the global optimum?
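To make the question concrete, here is a minimal sketch of what I have in mind, assuming a pure quadratic $f(x) = \tfrac{1}{2} x^\top H x$ with exactly this eigenvalue spectrum (the diagonal $H$, starting point, and step size below are my own illustrative choices):

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 * x^T H x whose Hessian H has one
# eigenvalue 1e5 and the rest equal to -1.
n = 5
H = np.diag([1e5] + [-1.0] * (n - 1))

def grad(x):
    return H @ x  # gradient of the quadratic

x = np.ones(n)   # start away from the stationary point at the origin
eta = 1e-5       # step size small enough for the 1e5 eigenvalue
for _ in range(200_000):
    x = x - eta * grad(x)

# The component along the positive-curvature direction is contracted
# toward 0, while the negative-curvature components grow geometrically
# by a factor (1 + eta) per step.
print(x[0], x[1])
```

In this sketch the iterates do not converge at all: since the remaining eigenvalues are negative, the quadratic is unbounded below, and gradient descent drifts off along the negative-curvature directions. My question is whether this picture changes when a single eigenvalue dominates the spectrum so strongly.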