Relationship between the Euclidean inner product and the $\ell_1$ norm


I was reading the book Statistical Learning with Sparsity, where the author assigns a gradient-descent exercise built around the following simple problem.

Let $\mathbf{Q}$ be a symmetric positive definite matrix and let $b, \beta \in \mathbb{R}^p$; consider $$ \min_{\beta\in\mathbb{R}^p} \; \beta^\top \mathbf{Q}\beta + \langle \beta , b\rangle. $$ I was thinking about minimization problems that place $\ell_p$ penalties on the vector $\beta$, as in $$ \min_{\beta\in\mathbb{R}^p} \; \beta^\top \mathbf{Q}\beta + \lambda\|\beta \|_1. $$ I know that $\|\beta\|_2=\sqrt{\langle\beta,\beta\rangle}$; but is it true that $\|\beta\|_1=\langle\beta,\operatorname{sign}(\beta)\rangle$, so that the first problem would be a general case of these penalized problems?
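The identity $\|\beta\|_1=\langle\beta,\operatorname{sign}(\beta)\rangle$ is easy to check numerically; here is a small sketch (the vector `beta` is just a random made-up example):

```python
import numpy as np

rng = np.random.default_rng(0)
beta = rng.standard_normal(6)

# |beta_i| = beta_i * sign(beta_i), so the l1 norm is the inner product
# of beta with its own sign vector (sign(0) = 0 causes no trouble, since
# a zero coordinate contributes nothing to either side).
lhs = np.abs(beta).sum()    # ||beta||_1
rhs = beta @ np.sign(beta)  # <beta, sign(beta)>
print(np.isclose(lhs, rhs))
```

The catch, of course, is that $\operatorname{sign}(\beta)$ depends on $\beta$ itself, so the penalized problem is not literally of the form $\min_\beta \beta^\top \mathbf{Q}\beta + \langle\beta, b\rangle$ for a fixed $b$.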

I know that the Riesz representation theorem states something like this, but I'm not sure.

Thanks in advance.

Best answer:

You can use duality to write $\|\beta\|_1 = \max_{\|b\|_\infty \le 1} \langle b, \beta \rangle$ (the maximum is attained at $b = \operatorname{sign}(\beta)$). Your $\ell_1$-penalized problem is then equivalent to $$ \min_{\beta} \max_{\|b\|_\infty \le 1} \beta^\top \mathbf{Q} \beta + \lambda \langle b, \beta \rangle = \max_{\|b\|_\infty \le 1}\min_{\beta} \beta^\top \mathbf{Q} \beta + \lambda \langle b, \beta \rangle, $$ where the equality between the min-max and the max-min is strong duality; it holds here because the objective is convex in $\beta$, linear in $b$, and the $\ell_\infty$ ball is compact. So, roughly, your problem reduces to solving infinitely many of those quadratic problems, one for each $b$ in the $\ell_\infty$ ball. Of course, you don't actually need to solve infinitely many problems: there is a dual optimal solution $b^*$ such that, once you find it, solving $\min_\beta \beta^\top \mathbf{Q} \beta + \lambda \langle b^*, \beta\rangle$ yields the solution of the original problem. There are many approaches for doing that; you can check primal-dual solvers for convex optimization problems, or write down the KKT conditions and try to solve them iteratively.
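To make this concrete, here is a small numerical sketch (not from the book; the matrix `Q`, the linear term `c`, and the penalty `lam` are made-up). Note that without a linear term the penalized problem as written is minimized trivially at $\beta = 0$, so the sketch adds one. It solves the $\ell_1$-penalized quadratic with proximal gradient descent (ISTA, one standard iterative method for this problem), reads off a dual optimal $b^*$ from the stationarity condition $0 \in 2\mathbf{Q}\beta^* + c + \lambda\,\partial\|\beta^*\|_1$, checks that $b^*$ lies in the $\ell_\infty$ ball, and verifies that minimizing the smooth inner problem at $b^*$ recovers $\beta^*$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
Q = A @ A.T + 3 * np.eye(3)  # symmetric positive definite
c = rng.standard_normal(3)   # linear term, added so the minimizer is nonzero
lam = 0.5

# Proximal gradient (ISTA) on f(beta) = beta^T Q beta + <beta, c> + lam * ||beta||_1.
# Step size 1/L, where L = 2 * lambda_max(Q) is the Lipschitz constant of the
# smooth part's gradient.
L = 2 * np.linalg.eigvalsh(Q)[-1]
beta = np.zeros(3)
for _ in range(5000):
    z = beta - (2 * Q @ beta + c) / L                         # gradient step
    beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold

# Stationarity: 0 in 2 Q beta* + c + lam * subdiff(||.||_1)(beta*), so
# b* = -(2 Q beta* + c) / lam is a subgradient of the l1 norm at beta*,
# hence lies in the l_inf unit ball.
b_star = -(2 * Q @ beta + c) / lam
print(np.max(np.abs(b_star)) <= 1 + 1e-6)

# Minimizing the smooth inner problem at b* recovers beta*:
# argmin_beta beta^T Q beta + <beta, c> + lam <b*, beta> = -Q^{-1}(c + lam b*) / 2.
beta_rec = np.linalg.solve(2 * Q, -(c + lam * b_star))
print(np.allclose(beta_rec, beta))
```

Both checks print `True`: the recovered $b^*$ is feasible for the dual, and plugging it into the smooth quadratic reproduces the $\ell_1$-penalized solution, exactly as the min-max argument above predicts.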