Closed-form solution for quadratic optimization with L2 regularization?


Assume I have the following objective, which I want to minimize over $x$:

$$\min_{x}\left(\frac{1}{2}||Mx+b||^{2}_{2}+\lambda\left<x,x\right>\right)$$

where $x$ is a variable column vector, $M$ is a constant matrix, $b$ is a constant column vector, and $\lambda$ is a user-specified scalar regularization coefficient. The first term is a quadratic objective; the second summand $\lambda\left<x,x\right>$ is an L2-regularization term.

If it were not for this regularization term, this objective would have a closed-form solution (see the answer to this question):

$$\nabla_x \|M x + b\|_2^2=\nabla_x \left(b^T b + 2 b^T M x + x^T M^T M x\right) = 2 \left(M^T b + M^T M x \right) = 0.$$
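As a sanity check, the unregularized closed form $x = -(M^T M)^{-1} M^T b$ can be verified numerically against an off-the-shelf least-squares solver. (The matrix and vector below are arbitrary random stand-ins, not from the original question.)

```python
import numpy as np

# Arbitrary example data (hypothetical, just for illustration).
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

# Closed-form minimizer of ||Mx + b||^2: solve the normal equations
# M^T M x = -M^T b.
x_closed = np.linalg.solve(M.T @ M, -M.T @ b)

# Cross-check: minimizing ||Mx + b||^2 is least squares on Mx = -b.
x_lstsq, *_ = np.linalg.lstsq(M, -b, rcond=None)
print(np.allclose(x_closed, x_lstsq))  # True
```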

However, since we do have L2 regularization, the optimal value of $x$ is affected by the regularization. Is it still possible to derive a closed-form solution? If yes, how would I go about this?



The gradient of the objective function $\phi$ is $$ \frac{\partial \phi}{\partial \mathbf{x}} = \mathbf{M}^T(\mathbf{Mx+b})+2\lambda \mathbf{x} = \left( \mathbf{M}^T\mathbf{M}+2\lambda \mathbf{I} \right) \mathbf{x} + \mathbf{M}^T \mathbf{b}. $$ Setting the gradient to zero yields the closed-form solution $$ \mathbf{x} = -\left( \mathbf{M}^T\mathbf{M}+2\lambda \mathbf{I} \right)^{-1} \mathbf{M}^T \mathbf{b}. $$ Note that $\mathbf{M}^T\mathbf{M}+2\lambda \mathbf{I}$ is positive definite, hence invertible, for any $\lambda > 0$, so the solution always exists and is unique.
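A minimal numerical sketch of this closed form (with arbitrary stand-in data): compute $x = -(M^T M + 2\lambda I)^{-1} M^T b$ and confirm that the gradient of the regularized objective vanishes there.

```python
import numpy as np

# Arbitrary example data (hypothetical, just for illustration).
rng = np.random.default_rng(1)
M = rng.standard_normal((6, 4))
b = rng.standard_normal(6)
lam = 0.5  # regularization coefficient lambda

# Closed-form solution: x = -(M^T M + 2*lam*I)^{-1} M^T b.
A = M.T @ M + 2 * lam * np.eye(M.shape[1])
x = np.linalg.solve(A, -M.T @ b)

# Gradient of (1/2)||Mx + b||^2 + lam * <x, x> should vanish at x.
grad = M.T @ (M @ x + b) + 2 * lam * x
print(np.allclose(grad, 0))  # True
```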