I would like to minimise a function, with multiple constraints:
$$ \frac{1}{2} \|y-Ax\|_2^2 + \beta \|z\|_1 $$
subject to
$$ Bx = 0 $$ and $$ x - z = 0 $$
In my case $(B+I)$ is not a valid matrix.
Do I just form the augmented Lagrangian as
$$ L_{\rho} = \frac{1}{2} \|y-Ax\|_2^2 + \beta \|z\|_1 + \theta^T(x-z) + \frac{\rho}{2}\|x-z\| + \nu^T(Bx) + \frac{\rho}{2}\|Bx\| $$
?
I would argue that you don't really have two separate constraints here; or rather, you don't need to consider them as two separate constraints. Treat them instead as a single equality constraint
\begin{equation} \begin{bmatrix} B & 0 \\ I & -I \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix} = 0 \end{equation}
The good news is that your Lagrange multipliers need not change, and the inner products are exactly equivalent to yours. Only the norm terms change. The new augmented Lagrangian is
\begin{equation} L_\rho = \tfrac{1}{2}\|y-Ax\|_2^2+\beta\|z\|_1+ \begin{bmatrix} \nu \\ \theta \end{bmatrix}^T \begin{bmatrix} B & 0 \\ I & -I \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix} + \tfrac{\rho}{2}\left\| \begin{bmatrix} B & 0 \\ I & -I \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix}\right\| \end{equation}
My memory may be hazy, but my understanding is that it is standard practice to square the norm of the constraint term:
\begin{equation} L_\rho = \tfrac{1}{2}\|y-Ax\|_2^2+\beta\|z\|_1+ \begin{bmatrix} \nu \\ \theta \end{bmatrix}^T \begin{bmatrix} B & 0 \\ I & -I \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix} + \tfrac{\rho}{2}\left\| \begin{bmatrix} B & 0 \\ I & -I \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix}\right\|_2^2 \end{equation}
If you had done this with your original approach above, you would find that your augmented Lagrangian is precisely the same as this.
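To see concretely why the two formulations agree, here is a minimal NumPy sketch (random data, hypothetical dimensions chosen for illustration): the squared norm of the stacked constraint equals the sum of the squared norms of the two separate constraints, so the two squared-penalty terms coincide.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((3, n))  # arbitrary 3 x n constraint matrix
x = rng.standard_normal(n)
z = rng.standard_normal(n)

# Stacked constraint matrix [[B, 0], [I, -I]] applied to (x, z)
C = np.block([[B, np.zeros((3, n))],
              [np.eye(n), -np.eye(n)]])
stacked = C @ np.concatenate([x, z])

# ||[Bx; x - z]||_2^2  versus  ||Bx||_2^2 + ||x - z||_2^2
lhs = np.linalg.norm(stacked) ** 2
rhs = np.linalg.norm(B @ x) ** 2 + np.linalg.norm(x - z) ** 2
print(np.isclose(lhs, rhs))  # True
```

The same block structure makes the inner-product term split as $\nu^T(Bx) + \theta^T(x-z)$, which is why the multipliers carry over unchanged.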