Why does the optimization of a convex function (least squares with $L^1$ regularization) not converge?


I am trying to find the best $x$ that fits $Ax = B$, and to this end I am minimizing the following cost function:

$$\min_x \|Ax-B\|_2+\beta\|x\|_1$$

But when I use the optim() function in R for the optimization, it converges to a bad point that is not even near the unregularized solution $x = \operatorname{pinv}(A)\,B$. I can initialize $x$ at that point to get a better result, but I want to know the reason for the failure to converge, because the cost function is convex and should therefore converge.
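For reference, the unregularized least-squares point the question compares against can be computed with the pseudoinverse. A minimal NumPy sketch (the matrix $A$ and vector $B$ below are made-up illustration data, not from the question):

```python
import numpy as np

# Toy data (illustrative only; not from the original question).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
B = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.01 * rng.standard_normal(20)

# Unregularized least-squares solution x = pinv(A) @ B,
# the reference point mentioned in the question.
x_ls = np.linalg.pinv(A) @ B

# Equivalent and numerically preferable:
x_lstsq, *_ = np.linalg.lstsq(A, B, rcond=None)
```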

There are 2 answers below.

BEST ANSWER

The optim() function will not solve your problem, since the objective is non-smooth (there are points where the derivative does not exist). The algorithms used by optim() rely on the smoothness of the objective function. However, your problem is equivalent to an SOCP: $$ \begin{aligned} \min_{x, s \in \mathbb{R}^n, t \in \mathbb{R}} &\quad t + \beta \sum_{i=1}^n s_i \\ \text{s.t.} &\quad ||Ax - B||_2 \leq t \\ &\quad x_i - s_i \leq 0 &\quad i = 1, \dots, n\\ &\quad -x_i -s_i \leq 0 &\quad i = 1, \dots, n \end{aligned} $$ The last two constraints come from rewriting $|x_i| \leq s_i$ as $-s_i \leq x_i \leq s_i$ and splitting it into two linear inequalities.
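The equivalence is easy to check numerically: for any $x$, the tight feasible choice $t = \|Ax - B\|_2$ and $s_i = |x_i|$ makes the SOCP objective equal to the original one. A small NumPy check with made-up data:

```python
import numpy as np

# Made-up illustration data.
rng = np.random.default_rng(1)
A = rng.standard_normal((10, 4))
B = rng.standard_normal(10)
beta = 0.5
x = rng.standard_normal(4)

# Original objective: ||Ax - B||_2 + beta * ||x||_1
original = np.linalg.norm(A @ x - B) + beta * np.abs(x).sum()

# SOCP variables chosen tight: t = ||Ax - B||_2, s_i = |x_i|.
# The constraints ||Ax - B||_2 <= t and -s_i <= x_i <= s_i then
# hold with equality, so the two objectives coincide.
t = np.linalg.norm(A @ x - B)
s = np.abs(x)
socp_objective = t + beta * s.sum()
```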

Now you can use an SOCP solver, such as the socp() function in the CLSOCP package, to solve your problem.


According to R's documentation, the function you mention, optimize(), only supports one-dimensional functions.

Moreover, note that the LASSO problem you're trying to minimize isn't smooth.
Hence you can't use a method that assumes the objective function is smooth.
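As an illustration of a method that handles the non-smooth $\ell_1$ term explicitly, here is a proximal-gradient (ISTA) sketch for the closely related squared-loss formulation $\min_x \tfrac{1}{2}\|Ax - B\|_2^2 + \beta\|x\|_1$ (note the question's objective uses the unsquared norm; this squared variant is the standard LASSO). The data below are made up for illustration:

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal operator of lam * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ista(A, B, beta, n_iter=500):
    """Proximal gradient (ISTA) for min 0.5*||Ax - B||_2^2 + beta*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - B)        # gradient of the smooth part
        x = soft_threshold(x - grad / L, beta / L)
    return x

# Made-up example data.
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 8))
B = A @ np.array([2.0, 0, 0, -1.0, 0, 0, 0, 0.5]) + 0.01 * rng.standard_normal(30)
x_hat = ista(A, B, beta=0.1)
```

The soft-thresholding step is exactly where the non-smoothness of the $\ell_1$ term is handled, which is what generic smooth optimizers like optim() lack.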