I am minimizing a non-smooth convex function and wondering which solver I should choose. Ideally I want the fastest one, i.e., the one with the best convergence rate. Subgradient descent is what the textbooks always present, but it is slow. I am looking for a state-of-the-art optimization algorithm for the non-smooth case.
I am actually studying a general setting. For example, the regression problem $$\|Kx-y\|_2+\|x\|_1,$$ where $$x\in\mathbb{R}^d,\; K\in\mathbb{R}^{n\times d},\; y\in\mathbb{R}^n.$$ I am not only trying to solve this regression problem but others as well, so I cannot make the question too specific. I found a good resource, http://napsu.karmitsa.fi/nsosoftware/, which lists several algorithms for the non-smooth case. Which one should I use?
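For concreteness, here is the kind of baseline I have in mind: a minimal subgradient-descent sketch for the example problem above (the step-size rule $c/\sqrt{t}$, the constant `c`, and the iteration count are arbitrary illustrative choices, not taken from any particular reference):

```python
import numpy as np

def subgradient_descent(K, y, x0, n_iter=500, c=0.5):
    """Minimize f(x) = ||Kx - y||_2 + ||x||_1 by subgradient descent
    with diminishing step sizes c / sqrt(t). Returns the best iterate,
    since subgradient methods are not descent methods."""
    f = lambda x: np.linalg.norm(K @ x - y) + np.abs(x).sum()
    x = x0.astype(float).copy()
    best_x, best_f = x.copy(), f(x)
    for t in range(1, n_iter + 1):
        r = K @ x - y
        nr = np.linalg.norm(r)
        # Subgradient of ||Kx - y||_2; at r = 0 the zero vector is a valid choice.
        g = K.T @ (r / nr) if nr > 0 else np.zeros_like(x)
        g += np.sign(x)  # subgradient of ||x||_1 (0 at x_i = 0 is valid)
        x -= (c / np.sqrt(t)) * g
        fx = f(x)
        if fx < best_f:
            best_f, best_x = fx, x.copy()
    return best_x, best_f
```

This illustrates why I call it slow: the $O(1/\sqrt{t})$ rate of the diminishing-step scheme is what I would like to beat with a more modern method.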