I am trying to find the $x$ that best fits $Ax = B$, and to that end I am minimizing the following cost function:
$$\min\|Ax-B\|_2+\beta\|x\|_1$$
but when I use the `optim()` function in R for the minimization, it converges to a bad point that is not even near $x = \text{pseudoinverse}(A) \cdot B$.
I can initialize $x$ at that pseudoinverse solution to get a better result, but I want to understand the reason for the failure, because the cost function is convex, so the optimization ought to converge.
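For reference, here is a minimal Python analogue of what I am doing, with `scipy.optimize.minimize` standing in for R's `optim()`; the matrix `A`, vector `B`, and `beta` are random placeholders, not my actual data:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))   # placeholder data
B = rng.standard_normal(30)
beta = 0.5

def cost(x):
    # ||Ax - B||_2 + beta * ||x||_1 (non-smooth wherever some x_i = 0, and at Ax = B)
    return np.linalg.norm(A @ x - B) + beta * np.sum(np.abs(x))

# A quasi-Newton method, like the ones optim() uses, applied to the non-smooth cost
res = minimize(cost, x0=np.zeros(10), method="BFGS")

# Least-squares reference point for comparison
x_pinv = np.linalg.pinv(A) @ B
```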
The `optim()` function will not solve your problem, because the objective is non-smooth (there are points where the derivative does not exist). The algorithms used by `optim()` rely on smoothness of the objective function. However, your problem is equivalent to an SOCP:

$$ \begin{aligned} \min_{x, s \in \mathbb{R}^n,\, t \in \mathbb{R}} &\quad t + \beta \sum_{i=1}^n s_i \\ \text{s.t.} &\quad \|Ax - B\|_2 \leq t \\ &\quad x_i - s_i \leq 0, &\quad i = 1, \dots, n,\\ &\quad -x_i - s_i \leq 0, &\quad i = 1, \dots, n. \end{aligned} $$

The last two constraints come from rewriting $|x_i| \leq s_i$ as $-s_i \leq x_i \leq s_i$ and splitting that into two linear inequalities. Now you can use an SOCP solver, such as the `socp` function in the CLSOCP package, to solve your problem.
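If you prefer an iterative method over the SOCP route, proximal-gradient descent (ISTA) handles the $\ell_1$ term exactly, with no smoothness assumption on it. Here is a sketch in Python; note that, as an assumption on my part, it minimizes the common *squared-loss* variant $\tfrac{1}{2}\|Ax-b\|_2^2 + \beta\|x\|_1$ rather than your unsquared norm, and the function names (`ista`, `soft_threshold`) are my own:

```python
import numpy as np

def soft_threshold(v, thresh):
    # Proximal operator of thresh * ||.||_1: shrinks each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

def ista(A, b, beta, n_iter=500):
    # Proximal gradient (ISTA) for min 0.5 * ||Ax - b||^2 + beta * ||x||_1.
    # The smooth part is handled by a gradient step, the non-smooth L1 part
    # exactly by soft-thresholding -- no derivative of |x| is ever needed.
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)       # gradient of the smooth part only
        x = soft_threshold(x - step * grad, step * beta)
    return x
```

With step size $1/L$ this scheme decreases the objective monotonically, which is exactly the guarantee a smooth-only method like `optim()` loses on this problem.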