This question concerns the convergence of Newton's method in unconstrained optimization, but it can be considered independently of that context.
Suppose $f$ is twice differentiable, the Hessian $\nabla^2 f(x)$ is Lipschitz continuous in a neighborhood of $x^*$, and $\nabla^2 f(x^*)$ is invertible. My questions are:
- Is there a neighborhood of $x^*$ in which $\nabla^2 f(x)$ is invertible?
- Is there a neighborhood of $x^*$ in which $\|\nabla^2 f(x)^{-1}\|_2 \leq 2\|\nabla^2 f(x^*)^{-1}\|_2$?
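To make the second question concrete, here is a small numerical sanity check (not a proof) on a hypothetical example of my own choosing: $f(x) = \tfrac{1}{2}x^\top A x + \sum_i x_i^4$, whose Hessian $\nabla^2 f(x) = A + 12\,\mathrm{diag}(x_i^2)$ is Lipschitz on bounded sets and invertible at $x^* = 0$. The script samples points in a ball around $x^*$ and checks whether the claimed bound holds there.

```python
import numpy as np

# Hypothetical example: f(x) = 0.5 x^T A x + sum(x_i^4),
# so the Hessian is H(x) = A + 12 diag(x_i^2).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # symmetric positive definite
x_star = np.zeros(2)        # H(x*) = A, which is invertible

def hessian(x):
    return A + 12.0 * np.diag(x**2)

inv_norm_star = np.linalg.norm(np.linalg.inv(hessian(x_star)), 2)

rng = np.random.default_rng(0)
radius = 0.1  # candidate neighborhood radius
ok = True
for _ in range(1000):
    x = x_star + radius * rng.uniform(-1.0, 1.0, size=2)
    H = hessian(x)
    inv_norm = np.linalg.norm(np.linalg.inv(H), 2)
    # check the conjectured bound ||H(x)^{-1}|| <= 2 ||H(x*)^{-1}||
    ok &= inv_norm <= 2.0 * inv_norm_star

print(ok)  # prints True
```

For this particular $f$ the check is guaranteed to pass, since adding a nonnegative diagonal to a positive definite matrix can only increase its smallest eigenvalue; of course, the check says nothing about the general case, which is what I am asking about.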
I've taken one semester each of Analysis and Numerical Linear Algebra but couldn't crack this. If you know the answer, could you also suggest a book that covers this sort of material?