Existence of accelerated subgradient methods


The Heavy Ball method and Nesterov's gradient method are two accelerated versions of gradient methods that achieve the optimal convergence rate for smooth optimization. I wonder whether there is an accelerated version of the subgradient method. If not, can you give an intuition for why acceleration does not work for non-smooth problems?
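For context: in the worst case over non-smooth convex Lipschitz functions, the subgradient method's O(1/√k) rate already matches the known lower bound, so momentum cannot improve the worst-case rate the way it does in the smooth setting. As a rough illustration (my own sketch, not from the question or answer), here is a comparison of the plain subgradient method against a hypothetical Nesterov-style momentum variant on the non-smooth objective f(x) = |x|; the step sizes and momentum schedule are illustrative choices, not a canonical algorithm:

```python
import math

def f(x):
    # Non-smooth convex objective f(x) = |x|, minimized at x* = 0.
    return abs(x)

def subgrad(x):
    # A subgradient of |x|: sign(x), choosing 0 at the kink.
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

def subgradient_method(x0, iters):
    # Plain subgradient method with diminishing steps t_k = 1/sqrt(k+1);
    # we track the best value seen, since the iterates oscillate.
    x, best = x0, f(x0)
    for k in range(iters):
        x -= subgrad(x) / math.sqrt(k + 1)
        best = min(best, f(x))
    return best

def momentum_subgradient(x0, iters):
    # Nesterov-style extrapolation applied to subgradient steps.
    # This is a heuristic: it does not yield the O(1/k^2) rate that
    # acceleration achieves for smooth functions.
    x, x_prev, best = x0, x0, f(x0)
    for k in range(iters):
        y = x + (k / (k + 3.0)) * (x - x_prev)
        x_prev = x
        x = y - subgrad(y) / math.sqrt(k + 1)
        best = min(best, f(x))
    return best

print(subgradient_method(5.0, 10000))
print(momentum_subgradient(5.0, 10000))
```

Running this, both variants reach a comparable (slowly shrinking) accuracy governed by the step size, rather than the momentum version pulling decisively ahead, which matches the lower-bound intuition.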


The text Accelerated Methods for the SOCP-relaxed Component-based Distributed Optimal Power Flow discusses an accelerated subgradient method for a particular problem.

Perhaps you can find other texts by searching for "accelerated subgradient method", possibly together with a math expression, on SearchOnMath.