Is there any method for gradient descent that achieves acceleration while moving always in the opposite direction of the gradient?


I'm studying gradient descent methods, in particular Nesterov's method and others that achieve better complexity (in terms of the number of calls to the gradient oracle) than regular gradient descent. For a smooth convex objective, accelerated gradient descent uses $O(1/\sqrt{\epsilon})$ calls to the oracle, as opposed to the $O(1/\epsilon)$ calls of regular gradient descent.

I've been reading about other accelerated methods, but all of them change the direction of descent. I was wondering whether there is a method that always moves in the direction opposite to the gradient at the current iterate and that also needs only $O(1/\sqrt{\epsilon})$ calls to the gradient oracle.
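To make the contrast concrete, here is a minimal sketch (my own illustration, not from the question) comparing plain gradient descent with Nesterov's accelerated method on an ill-conditioned smooth convex quadratic. The momentum schedule $k/(k+3)$ and the problem data are standard textbook choices, not anything specified above. Note that the accelerated method evaluates the gradient at an extrapolated point $y_k$, so its net step from $x_k$ is generally not parallel to $-\nabla f(x_k)$, which is exactly the property the question asks to avoid.

```python
import numpy as np

# Smooth convex quadratic f(x) = 0.5 * x^T A x, minimized at x* = 0.
A = np.diag([1.0, 100.0])      # condition number 100
grad = lambda x: A @ x
f = lambda x: 0.5 * x @ A @ x
L = 100.0                      # smoothness (Lipschitz) constant of the gradient
x0 = np.array([1.0, 1.0])
n_steps = 200

# Plain gradient descent: every step is exactly -(1/L) * grad(x_k),
# i.e. always in the direction opposite to the gradient.
x = x0.copy()
for _ in range(n_steps):
    x = x - grad(x) / L
f_gd = f(x)

# Nesterov's accelerated method: the gradient is taken at the
# extrapolated point y_k, so the step from x_k is NOT along -grad(x_k).
x, x_prev = x0.copy(), x0.copy()
for k in range(n_steps):
    y = x + (k / (k + 3)) * (x - x_prev)   # momentum / extrapolation
    x_prev, x = x, y - grad(y) / L
f_agd = f(x)

print(f_gd, f_agd)
```

On this example the accelerated iterate reaches a much smaller objective value after the same number of gradient calls, matching the $O(1/\sqrt{\epsilon})$ versus $O(1/\epsilon)$ gap; the price is that its update direction mixes in the momentum term $x_k - x_{k-1}$.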