Since the Hessian $\nabla^2 f(x^*)$ is non-singular, there is a radius $r > 0$ such that $\|\nabla^2 f_k^{-1}\| \le 2\|\nabla^2 f(x^*)^{-1}\|$ for all $x_k$ with $\|x_k - x^*\| \le r$. Here $x_k$ is the $k$-th iterate of Newton's method, $x^*$ is the optimum, and $\nabla^2 f_k$ abbreviates $\nabla^2 f(x_k)$.
I came across this statement in the book *Numerical Optimization* by Nocedal and Wright, at equation $(3.32)$. I have no clue why this is true.
The point is that the mapping $A \mapsto A^{-1}$ is continuous on the set of invertible matrices (this follows, e.g., from the Neumann series). Hence, if $f$ is twice continuously differentiable and $x_k \to x^*$, then $\nabla^2 f_k^{-1} \to \nabla^2 f(x^*)^{-1}$, and in particular $\|\nabla^2 f_k^{-1}\| \le 2\|\nabla^2 f(x^*)^{-1}\|$ once $\|x_k - x^*\|$ is small enough.
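To see where the factor $2$ comes from, here is a short quantitative sketch of the Neumann-series argument (writing $A = \nabla^2 f(x^*)$ and $E_k = \nabla^2 f_k - A$; any submultiplicative matrix norm works):

```latex
\begin{align*}
% Write A = \nabla^2 f(x^*) and E_k = \nabla^2 f_k - A, so that
% \nabla^2 f_k = A + E_k = A\,(I + A^{-1}E_k).
% Since \nabla^2 f is continuous, choose r > 0 small enough that
% \|x_k - x^*\| \le r implies \|E_k\| \le \tfrac{1}{2\|A^{-1}\|},
% and hence \|A^{-1}E_k\| \le \|A^{-1}\|\,\|E_k\| \le \tfrac{1}{2} < 1.
% The Neumann series then shows that I + A^{-1}E_k is invertible, with
% (I + A^{-1}E_k)^{-1} = \sum_{j=0}^{\infty} (-A^{-1}E_k)^j, so that
\|\nabla^2 f_k^{-1}\|
  &= \bigl\|(I + A^{-1}E_k)^{-1} A^{-1}\bigr\|
   \le \frac{\|A^{-1}\|}{1 - \|A^{-1}E_k\|}
   \le 2\,\|A^{-1}\| = 2\,\|\nabla^2 f(x^*)^{-1}\|.
\end{align*}
```

This is exactly the bound in $(3.32)$: the constant $2$ is just the geometric-series factor $1/(1 - \tfrac{1}{2})$ from choosing $\|A^{-1}E_k\| \le \tfrac{1}{2}$; any constant larger than $1$ would work with a correspondingly smaller $r$.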