$ {L}_{1} $ Norm Regularized Minimization of a Convex Function with a Linear Equality Constraint Using the ADMM Framework


In section 6.3 of Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers there is a method for minimizing a loss function with $\ell_1$ regularization, i.e.

minimize $\ell(\mathbf{x}) + \lambda \|\mathbf{x}\|_1$

How can I add the equality constraint

$\sum\limits_{i} x_{i} = 1$ to such a problem and perform the optimization?


There are 2 best solutions below


One approach could be to rewrite the constraint as $$\sum_i x_i - 1 = 0,$$ which motivates adding the penalty term $$\lambda_S\left\|\sum_i x_i - 1\right\|_k = \lambda_S\left\|{\bf S x} - 1\right\|_k$$ for some suitable $k$, where $\bf S$ is the matrix representing the sum operator, so ${\bf S x}$ is simply the dot product of $\bf x$ with a vector of ones. The constraint holds exactly when the expression inside the norm is $0$; the choice of norm $k$ (and the size of the scalar weight $\lambda_S$) determines how strongly the penalty pushes it toward $0$. A minimal sketch of this idea follows.
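For concreteness, here is a minimal Python sketch of the penalty idea, assuming $ l \left( \mathbf{x} \right) = \frac{1}{2} \left\| A \mathbf{x} - \mathbf{b} \right\|_2^2 $ and a squared $2$-norm penalty (a particular choice of $k$, squared so the smooth part stays differentiable); `A`, `b`, `lam` and `lam_S` are illustrative placeholders. With a squared penalty the constraint is only satisfied approximately, more tightly as `lam_S` grows.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 20
A = rng.standard_normal((m, n))      # illustrative data: l(x) = 0.5*||A x - b||^2
b = rng.standard_normal(m)
lam, lam_S = 0.1, 100.0
ones = np.ones(n)                    # S x is just the dot product 1^T x

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

# Proximal gradient (ISTA): smooth part = loss + squared penalty, prox handles lam*||x||_1
L = np.linalg.norm(A, 2) ** 2 + 2.0 * lam_S * n   # Lipschitz constant of the smooth gradient
step = 1.0 / L

x = np.zeros(n)
for _ in range(5000):
    grad = A.T @ (A @ x - b) + 2.0 * lam_S * (ones @ x - 1.0) * ones
    x = soft_threshold(x - step * grad, step * lam)

print(np.sum(x))   # approximately 1; larger lam_S enforces the constraint more tightly
```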


You have 2 options here:

  1. Solve the problem using ADMM, transforming $ \left\| x \right\|_{1} $ into $ \left\| z \right\|_{1} $ (the classic LASSO ADMM). In the $ x $ update step, add the projection onto the set via $ x \leftarrow x - \frac{ \boldsymbol{1}^{T} x - 1 }{n} \boldsymbol{1} $, which is the orthogonal projection onto the set (see the sketch after this list).
  2. Solve a different ADMM by adding $ I_{\mathcal{S}} \left( z \right) $, the indicator function of the constraint set $ \mathcal{S} = \left\{ x : \sum_i x_i = 1 \right\} $. This yields an ADMM where one step solves a LASSO problem with respect to $ x $ (actually LASSO with Tikhonov regularization, which is called Elastic Net regularization) and the other step, with respect to $ z $, is a projection operation (as in (1)); a sketch of this variant appears at the end of the answer.
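A minimal sketch of option 1, assuming the classic LASSO loss $ l \left( x \right) = \frac{1}{2} \left\| A x - b \right\|_{2}^{2} $ from section 6.3 of the reference; `A`, `b`, `lam` and `rho` below are illustrative placeholders. The projection is applied right after the usual $ x $ minimization, exactly as described in (1).

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 20
A = rng.standard_normal((m, n))      # illustrative data: l(x) = 0.5*||A x - b||^2
b = rng.standard_normal(m)
lam, rho = 0.1, 1.0

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

# Cache the matrix used in every x-update
AtA_rhoI = A.T @ A + rho * np.eye(n)
Atb = A.T @ b

x = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)
for _ in range(500):
    # x-update of the classic LASSO ADMM: argmin 0.5*||Ax-b||^2 + (rho/2)*||x - z + u||^2
    x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))
    # projection onto {x : sum(x) = 1}: x <- x - (1^T x - 1)/n * 1
    x = x - (np.sum(x) - 1.0) / n
    # z-update: soft thresholding, the prox of (lam/rho)*||.||_1
    z = soft_threshold(x + u, lam / rho)
    # dual (scaled) update
    u = u + x - z

print(np.sum(x))   # equals 1 up to numerical error, by the projection step
```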

Clearly, the first approach is much easier.
If you want code, let me know.
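For completeness, a sketch of option 2 under the same assumed loss. Here the $ x $ update is the $ \ell_1 $ + quadratic (Elastic-Net-like) subproblem, solved by a few inner proximal gradient steps just to keep the sketch self-contained, and the $ z $ update is the projection onto $ \mathcal{S} $.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 20
A = rng.standard_normal((m, n))      # illustrative data: l(x) = 0.5*||A x - b||^2
b = rng.standard_normal(m)
lam, rho = 0.1, 1.0

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

# Step size for the inner proximal-gradient solves of the x subproblem
L = np.linalg.norm(A, 2) ** 2 + rho  # Lipschitz constant of the smooth part's gradient
step = 1.0 / L

x = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)
for _ in range(300):
    # x-update: argmin 0.5*||Ax-b||^2 + lam*||x||_1 + (rho/2)*||x - z + u||^2
    # (l1 + quadratic subproblem; a few ISTA steps serve as an inner solver in this sketch)
    for _ in range(20):
        grad = A.T @ (A @ x - b) + rho * (x - z + u)
        x = soft_threshold(x - step * grad, step * lam)
    # z-update: projection of x + u onto S = {z : sum(z) = 1}
    v = x + u
    z = v - (np.sum(v) - 1.0) / n
    # dual (scaled) update
    u = u + x - z

print(np.sum(z))   # z satisfies the constraint exactly; x -> z as ADMM converges
```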