I'm having trouble with a convex optimization problem. My objective $f(X)$ is a convex function of $X$, where $X$ is a positive definite matrix. $X$ is very sparse, with only a handful of nonzero entries, and I only need to optimize over those few variables. I can compute the gradient with respect to just those variables.
I was thinking of using a quasi-Newton method that approximates the Hessian via BFGS. My question: suppose my current gradient is $G$ and my approximate Hessian is $H$; then I update my parameters as
$x_{t+1} = x_t -\alpha H^{-1}G$.
If I choose the step size $\alpha$ by Armijo's rule (backtracking line search), can I prove that for some $\alpha > 0$ the updated $X$ remains positive definite and the objective value decreases, or is neither guaranteed?
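To make the setup concrete, here is a minimal sketch of what I mean, with a hypothetical stand-in objective $f(X) = -\log\det X + \mathrm{tr}(CX)$ (convex on the PD cone) and a fixed $2\times 2$ sparsity pattern with three free entries. The matrix `C`, the pattern in `build_X`, and all constants are illustrative assumptions, not my actual problem. The Armijo backtracking here treats points outside the PD cone as $f = +\infty$, so such steps are rejected automatically:

```python
import numpy as np

def is_pd(X):
    """Check positive definiteness via a Cholesky attempt."""
    try:
        np.linalg.cholesky(X)
        return True
    except np.linalg.LinAlgError:
        return False

# Hypothetical data for the stand-in objective f(X) = -log det X + tr(C X).
C = np.array([[3.0, 0.5],
              [0.5, 2.0]])

def build_X(x):
    # x holds the handful of free entries: two diagonal, one off-diagonal.
    return np.array([[x[0], x[2]],
                     [x[2], x[1]]])

def f(x):
    X = build_X(x)
    sign, logdet = np.linalg.slogdet(X)
    if sign <= 0 or not is_pd(X):
        return np.inf          # outside the PD cone: step will be rejected
    return -logdet + np.trace(C @ X)

def grad(x):
    X = build_X(x)
    G = C - np.linalg.inv(X)   # dF/dX for this objective
    # Chain rule back to the free entries (off-diagonal appears twice).
    return np.array([G[0, 0], G[1, 1], 2.0 * G[0, 1]])

def bfgs_armijo(x, iters=50, c1=1e-4, beta=0.5):
    n = len(x)
    H = np.eye(n)              # inverse-Hessian approximation
    g = grad(x)
    for _ in range(iters):
        d = -H @ g             # search direction x_{t+1} = x_t + alpha * d
        # Armijo backtracking; f = +inf outside the PD cone, so the
        # sufficient-decrease test also rejects non-PD iterates.
        alpha = 1.0
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= beta
            if alpha < 1e-12:
                return x
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:         # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

For this particular objective the minimizer is $X^\star = C^{-1}$, so one can check whether the iterates stay PD and converge there; my question is whether such behavior is provable in general or specific to nice cases like this one.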