Error bounds for the subgradient method


I am struggling to understand the bound for a constant step size with the subgradient method given in this image taken from page 4 of this link, where $f^{(k)}_{best}$ is the best function value seen so far along the subgradient path. I don't follow how we conclude that $f(x^{(k)}) - f^{*} \leq G^{2}h$, where $h$ is the step size and $G$ is the Lipschitz constant. Since the subgradient method is not a descent method, I would have thought that $x^{(k)}$ could diverge to $\infty$. Apologies if I am missing something basic here.
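To build some intuition for why the iterates do not run off to $\infty$, here is a minimal numerical sketch of my own (not from the linked notes) for $f(x) = |x|$, which has Lipschitz constant $G = 1$. With a constant step size $h$, the iterate eventually oscillates within $h/2$ of the minimizer, so $f^{(k)}_{best} - f^{*}$ settles at about $G^{2}h/2$ rather than diverging:

```python
def subgrad_abs(x):
    # A subgradient of f(x) = |x|: sign(x); any value in [-1, 1] is valid at x = 0.
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

def subgradient_method(x0, h, iters):
    # Constant-step-size subgradient method for f(x) = |x| (f* = 0),
    # tracking the best value seen so far. Not a descent method:
    # individual steps can overshoot, but the iterate stays bounded.
    x = x0
    f_best = abs(x)
    for _ in range(iters):
        x = x - h * subgrad_abs(x)
        f_best = min(f_best, abs(x))
    return f_best

# Starting at x0 = 1.05 with h = 0.1, the iterate walks down to near 0,
# then oscillates between roughly +0.05 and -0.05, so f_best stalls
# near h/2 = G^2 h / 2 instead of decreasing to 0 or blowing up.
f_best = subgradient_method(x0=1.05, h=0.1, iters=100)
print(f_best)
```

The key point the bound captures: each step is of fixed length $h\,\|g\| \le Gh$, and once the iterate is close to the optimum the subgradient flips sign, pulling it back, which is why divergence cannot happen for this example despite the lack of monotone descent.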