Is There a Connection Between Minimum $ {L}_{1} $ Norm Solution and LASSO?


I am reading a book about sparsity, Statistical Learning with Sparsity: The Lasso and Generalizations. I want to know the relationship between the following two optimization problems: $$\min_{\beta} \|\beta\|_{1} \quad \text{s.t.} \quad X\beta = y$$ and $$\arg\min_{\beta} \frac{1}{2n}\|X\beta - y\|_{2}^{2} + \lambda \|\beta\|_{1},$$ where $X \in \mathbb{R}^{n\times d}$, $y \in \mathbb{R}^{n}$, $\beta \in \mathbb{R}^{d}$, $d > n$, and $\lambda > 0$.

Can we say that the Lasso is a relaxed version of the minimum $L_{1}$ norm problem?


I may have found the answer in the book High-Dimensional Statistics: A Non-Asymptotic Viewpoint. On page 206 of that book, it is shown that the second optimization problem is equivalent to the following constrained problem:

$$\min_{\beta} \|\beta\|_{1} \quad \text{s.t.} \quad \frac{1}{2n}\|X\beta - y\|_{2}^{2} \leq b^{2}$$ for some noise tolerance $b > 0$.

Therefore, we can say that the Lasso is a noise-tolerant relaxation of the minimum $L_{1}$ norm problem: instead of requiring $X\beta = y$ exactly, it only requires the residual to stay within the tolerance $b$.
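To make this correspondence concrete, here is a small numerical sketch (my own illustration, not from either book): solve the penalized form with ISTA (proximal gradient descent), then read off the residual level $b^{2} = \frac{1}{2n}\|X\hat\beta - y\|_{2}^{2}$ that the equivalent constrained problem would use for this $\lambda$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 50                       # underdetermined: d > n
X = rng.standard_normal((n, d))
beta_true = np.zeros(d)
beta_true[:3] = [2.0, -1.5, 1.0]    # sparse ground truth
y = X @ beta_true                   # noiseless observations, so X beta = y is feasible

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=20000):
    """Minimize (1/2n)||X b - y||_2^2 + lam * ||b||_1 by ISTA."""
    n = X.shape[0]
    L = np.linalg.norm(X, 2) ** 2 / n   # Lipschitz constant of the smooth part's gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n    # gradient of (1/2n)||X b - y||^2
        b = soft_threshold(b - grad / L, lam / L)
    return b

lam = 1e-3
beta_hat = lasso_ista(X, y, lam)
b_sq = np.sum((X @ beta_hat - y) ** 2) / (2 * n)  # implied noise tolerance b^2
print("||beta_hat||_1 =", np.linalg.norm(beta_hat, 1))
print("implied b^2    =", b_sq)
```

With a small $\lambda$ the implied tolerance $b^{2}$ is tiny, i.e. the penalized solution nearly interpolates the data while keeping $\|\beta\|_{1}$ small.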

However, in the noiseless case, how can we adjust $\lambda$ so that the solution of the second optimization problem coincides with the solution of the first one?
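A partial observation on this last question (my own sketch, not from the book): for any fixed $\lambda > 0$ the two solutions generally differ, but comparing the Lasso minimizer against a feasible point shows what happens as $\lambda \to 0^{+}$. Let $\beta^{*}$ solve the first problem, so $X\beta^{*} = y$, and let $\hat\beta_{\lambda}$ solve the second. Since $\hat\beta_{\lambda}$ minimizes the penalized objective,

```latex
\frac{1}{2n}\|X\hat\beta_{\lambda} - y\|_{2}^{2} + \lambda\|\hat\beta_{\lambda}\|_{1}
\;\le\; \frac{1}{2n}\|X\beta^{*} - y\|_{2}^{2} + \lambda\|\beta^{*}\|_{1}
\;=\; \lambda\|\beta^{*}\|_{1}.
```

Dropping either nonnegative term on the left gives $\|\hat\beta_{\lambda}\|_{1} \le \|\beta^{*}\|_{1}$ and $\frac{1}{2n}\|X\hat\beta_{\lambda} - y\|_{2}^{2} \le \lambda\|\beta^{*}\|_{1} \to 0$ as $\lambda \to 0^{+}$. So no single $\lambda$ is needed: in the limit $\lambda \to 0^{+}$, accumulation points of $\hat\beta_{\lambda}$ are feasible for $X\beta = y$ and have minimal $L_{1}$ norm.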