Convergence of Newton's method modified with a minimizing step size


This is an exercise from Chapter 8 of Bazaraa's Nonlinear Programming book.

Given a twice continuously differentiable function $f$ whose Hessian $H$ is invertible everywhere, consider the iteration $x_{k+1}=x_k - t_k H(x_k)^{-1} \nabla f(x_k)$, i.e. $x_{k+1}=x_k+t_k d_k$ with Newton direction $d_k := -H(x_k)^{-1}\nabla f(x_k)$, where $t_k$ minimizes $f(x_k +td_k)$ over $t$. We need to show this modified method converges to a point in the set $\{x:\nabla f(x)^TH(x)^{-1}\nabla f(x)=0\}$.
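To get a feel for the iteration, here is a small numerical sketch of it: the Newton direction is computed from an analytic gradient and Hessian, and the exact minimization over $t$ is approximated by a golden-section search. The test function $f(x,y)=\cosh x+\cosh y+\tfrac12(x-y)^2$ is my own choice for illustration (its Hessian is positive definite, hence invertible, everywhere, matching the hypothesis); it is not from the exercise.

```python
import math

# Damped Newton iteration x_{k+1} = x_k + t_k d_k, where
# d_k = -H(x_k)^{-1} grad f(x_k) and t_k minimizes f(x_k + t d_k).
# Test function (my assumption, not from the exercise); unique minimizer (0, 0).

def f(x, y):
    return math.cosh(x) + math.cosh(y) + 0.5 * (x - y) ** 2

def grad(x, y):
    return math.sinh(x) + (x - y), math.sinh(y) - (x - y)

def hess(x, y):
    # Entries a, b, d of the symmetric Hessian [[a, b], [b, d]].
    return math.cosh(x) + 1.0, -1.0, math.cosh(y) + 1.0

def golden_section(phi, a=0.0, b=4.0, tol=1e-12):
    # Surrogate for the exact line search: minimize phi on [a, b].
    gr = (math.sqrt(5.0) - 1.0) / 2.0
    while b - a > tol:
        c, dpt = b - gr * (b - a), a + gr * (b - a)
        if phi(c) < phi(dpt):
            b = dpt
        else:
            a = c
    return (a + b) / 2.0

x, y = 2.0, -1.0
for _ in range(50):
    gx, gy = grad(x, y)
    a, b, d2 = hess(x, y)
    det = a * d2 - b * b                  # nonzero: Hessian invertible by assumption
    dx = (-gx * d2 + b * gy) / det        # d_k = -H^{-1} grad f, via Cramer's rule
    dy = (b * gx - a * gy) / det
    # The quantity defining the limit set: grad f^T H^{-1} grad f = -grad f . d_k
    if -(gx * dx + gy * dy) < 1e-20:
        break
    t = golden_section(lambda s: f(x + s * dx, y + s * dy))
    x, y = x + t * dx, y + t * dy

print(x, y)  # both approach 0, the unique stationary point
```

Since $H$ is positive definite here, $\nabla f^T H^{-1}\nabla f>0$ whenever $\nabla f\neq 0$, so the iterates drive that quantity to $0$ exactly as the exercise predicts.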

I am not quite sure how to approach this; all I know is that the derivative of $f(x_k +td_k)$ with respect to $t$ vanishes at $t_k$, i.e. $\nabla f(x_{k+1})^T d_k = 0$. I tried to follow the idea of a convergence proof for Newton's method, but the limit set here is not a singleton.

Any idea is appreciated! Thanks.