Convergence rate when solving L1 regularized optimization via coordinate descent with tiny step?


Is there an established result for the convergence rate when solving $l_1$-regularized optimization via coordinate descent with a tiny step? By "tiny step" I mean that the step size is always set to a very small positive constant, with no step-size selection technique involved.

A weak result, derived from Boyd & Vandenberghe, "Convex Optimization", p. 479, is that for $l_1$-steepest descent (which coincides with greedy coordinate descent), with a fixed step size $t=\frac{1}{M}$ we have $$ f(x^{(k)}) - p^* \leq c^k\,(f(x^{(0)})-p^*),\qquad c = 1-\frac{m}{Mn} < 1, $$ where the Hessian of $f$ is bounded as $mI \preceq \nabla^2 f(x) \preceq MI$. I would advise looking at that section to see whether you can get something better.
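For concreteness, here is a small numerical sketch (my own illustration, not from the reference; the function names and the random test problem are assumptions) that runs greedy coordinate descent with the fixed step $t = 1/M$ on a smooth strongly convex quadratic and checks the quoted contraction factor $c = 1 - \frac{m}{Mn}$:

```python
import numpy as np

# Illustrative sketch: greedy coordinate descent (the l1-steepest descent
# direction) with a fixed step t = 1/M on the quadratic
#   f(x) = 0.5 x^T A x - b^T x,   with  m I <= A = Hessian <= M I.
# We record the optimality gaps f(x^(k)) - p* to check the linear rate
#   f(x^(k)) - p* <= c^k (f(x^(0)) - p*),  c = 1 - m/(M n).

def greedy_cd(A, b, iters=50):
    """Run greedy coordinate descent with step 1/M; return gaps and c."""
    n = len(b)
    eigs = np.linalg.eigvalsh(A)
    m, M = eigs[0], eigs[-1]
    f = lambda x: 0.5 * x @ A @ x - b @ x
    p_star = f(np.linalg.solve(A, b))       # optimal value of the quadratic
    c = 1.0 - m / (M * n)                   # theoretical per-step contraction

    x = np.zeros(n)
    gaps = [f(x) - p_star]
    for _ in range(iters):
        g = A @ x - b                       # gradient of f
        i = np.argmax(np.abs(g))            # coordinate of steepest descent
        x[i] -= g[i] / M                    # fixed "tiny" step t = 1/M
        gaps.append(f(x) - p_star)
    return np.array(gaps), c

# Small random test problem with Hessian eigenvalues in [1, 4] (m=1, M=4).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
A = Q @ np.diag(np.linspace(1.0, 4.0, 5)) @ Q.T
b = rng.standard_normal(5)

gaps, c = greedy_cd(A, b)
```

In exact arithmetic every iteration contracts the optimality gap by at least the factor $c$; numerically this holds until the gap reaches floating-point noise, so the check below only inspects gaps well above machine precision.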

EDIT: This doesn't fully answer the original question, my apologies. However, $l_1$-regularized programs can be recast as SOCPs; some solution techniques are listed on pp. 599 and 601 of the same reference.
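For concreteness, the standard recast (my own illustration of the well-known epigraph trick, not spelled out in the answer) for $l_1$-regularized least squares is $$ \min_x \;\|Ax-b\|_2^2 + \lambda\|x\|_1 \quad\Longleftrightarrow\quad \min_{x,\,u}\; \|Ax-b\|_2^2 + \lambda\,\mathbf{1}^Tu \quad \text{s.t. } -u \preceq x \preceq u, $$ which is a QP with linear constraints; further bounding the residual with a second-order cone constraint, $\|Ax-b\|_2 \leq s$ and minimizing over $s$, yields an SOCP.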