Conjugate gradient with inaccurate gradient information


I am trying to minimize an unconstrained nonlinear function using the conjugate gradient method. I don't have exact gradient information $\nabla f(x)$, but I do have an approximate gradient $\nabla f_a(x)$ satisfying $\| \nabla f(x) - \nabla f_a(x)\| \leq \delta$ for some constant $\delta > 0$ and all $x \in \mathbb{R}^n$. Given that $-\nabla f_a$ is a descent direction, will conjugate gradient with this approximate gradient converge to a local minimum? If this is not true in general, can we say something for strictly convex quadratic functions?
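To make the setting concrete, here is a small numerical sketch of what I have in mind (the test problem, the noise model, and the Fletcher–Reeves update are my own illustrative choices, not part of the actual application): nonlinear CG driven by a gradient perturbed by an error of norm exactly $\delta$, applied to a strictly convex quadratic $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical strictly convex quadratic test problem:
#   f(x) = 0.5 x^T A x - b^T x,  with A symmetric positive definite.
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # eigenvalues bounded below by n
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)     # exact minimizer

delta = 1e-3                       # gradient error bound

def approx_grad(x):
    """Exact gradient A x - b plus a perturbation of norm delta."""
    e = rng.standard_normal(n)
    e *= delta / np.linalg.norm(e)
    return A @ x - b + e

# Nonlinear CG (Fletcher-Reeves variant) driven by the inexact gradient,
# with exact line search along d (cheap for a quadratic).
x = np.zeros(n)
g = approx_grad(x)
d = -g
for k in range(200):
    alpha = -(g @ d) / (d @ (A @ d))   # minimizer of f along d, up to noise in g
    x = x + alpha * d
    g_new = approx_grad(x)
    beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
    d = -g_new + beta * d
    g = g_new

# Empirically the iterates approach x_star only up to a neighborhood
# whose size scales with delta; the error does not decay to zero.
err = np.linalg.norm(x - x_star)
print(err)
```

In runs like this the residual stalls at a level proportional to $\delta$ rather than converging, which is what prompts the question about what can be proved.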