Showing that the Lasso solution path is linear as $\lambda$ ranges


Consider the Lasso problem with regularization parameter $\lambda$. Suppose that the set of active predictors is unchanged for $\lambda_0 \geq \lambda \geq \lambda_1$. Show that there is a vector $\gamma_0$ such that $$\hat{\theta}(\lambda) = \hat{\theta}(\lambda_0) - (\lambda - \lambda_0)\gamma_0.$$ Thus, the Lasso solution path is linear as $\lambda$ ranges from $\lambda_0$ to $\lambda_1$.

Background: Let $A$ be a real $m \times n$ matrix and $b \in \mathbb R^m$ a given vector. The Lasso optimization problem is $$ \text{minimize} \quad \frac12 \| Ax - b \|_2^2 + \lambda \| x \|_1. $$ The optimization variable is $x \in \mathbb R^n$.

The $\ell_1$-norm regularization term encourages $x$ to be sparse, so Lasso is useful for finding a sparse vector $x$ that satisfies $Ax \approx b$. The parameter $\lambda > 0$ controls how sparse the solution to the Lasso problem is.
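As a concrete illustration of how $\lambda$ controls sparsity, here is a minimal sketch (not part of the original question) that solves the Lasso objective above with plain proximal gradient descent (ISTA) in NumPy. The solver, the random problem instance, and all names below are illustrative choices, not anything specified in the post. It uses the fact that the zero vector is optimal precisely when $\lambda \geq \|A^T b\|_\infty$.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward 0 by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=5000):
    """Proximal gradient (ISTA) for 0.5 * ||Ax - b||_2^2 + lam * ||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / largest squared singular value
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)

# Small lambda: a (mostly) dense solution with nonzero coefficients.
x_small = lasso_ista(A, b, lam=0.1)
# lam >= ||A^T b||_inf: the solution is identically zero.
x_big = lasso_ista(A, b, lam=np.max(np.abs(A.T @ b)))
```

Note that the soft-threshold step produces *exact* zeros, which is why the Lasso, unlike ridge regression, yields genuinely sparse solutions.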

I am not sure how to proceed with this problem, or what technique to use in order to show that the solution path is linear.

1 Answer


The claim to prove is not about $\hat{\lambda}$; in the notation of the background it reads $\hat{x}(\lambda) = \hat{x}(\lambda_0) - (\lambda - \lambda_0)\gamma_0$, where $\hat{x}(\lambda)$ is the minimizer for parameter $\lambda$ and $\gamma_0$ is a fixed vector that does not depend on $\lambda$.

To see why, let $S$ be the active set, which by assumption is unchanged for $\lambda_0 \geq \lambda \geq \lambda_1$, and let $s_S = \operatorname{sign}(\hat{x}_S(\lambda))$ be the (likewise unchanged) sign vector on $S$. The KKT stationarity condition restricted to $S$ is $$A_S^T\big(A_S \hat{x}_S(\lambda) - b\big) + \lambda\, s_S = 0,$$ so if $A_S^T A_S$ is invertible, $$\hat{x}_S(\lambda) = (A_S^T A_S)^{-1}\big(A_S^T b - \lambda\, s_S\big),$$ which is affine in $\lambda$. Subtracting the same expression at $\lambda_0$ gives $\hat{x}_S(\lambda) = \hat{x}_S(\lambda_0) - (\lambda - \lambda_0)\gamma_0$ with $\gamma_0 = (A_S^T A_S)^{-1} s_S$, extended by zeros off $S$.
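This can be checked numerically. The sketch below (my own illustration, not from the post) solves the Lasso with a simple ISTA loop at two values of $\lambda$ in an interval where the active set is unchanged, builds $\gamma_0 = (A_S^T A_S)^{-1} s_S$ from the KKT condition, and verifies the linear-path identity. An orthogonal design $A = I$ is chosen so the active set is easy to control; with $b = (3, -2, 0.5)$ the active set is $\{0, 1\}$ for every $\lambda \in [0.6, 1.5]$.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=5000):
    # Proximal gradient (ISTA) for 0.5 * ||Ax - b||_2^2 + lam * ||x||_1.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

# Orthogonal design chosen so the active set {0, 1} is unchanged
# for every lambda in [lam1, lam0] = [0.6, 1.5] (here lam0 >= lam1,
# matching the problem's convention lambda_0 >= lambda >= lambda_1).
A = np.eye(3)
b = np.array([3.0, -2.0, 0.5])
lam0, lam1 = 0.6, 1.5

x0 = lasso_ista(A, b, lam0)
x1 = lasso_ista(A, b, lam1)

# KKT on the active set S: A_S^T (A_S x_S - b) + lam * s_S = 0, so
# gamma_0 = (A_S^T A_S)^{-1} s_S on S, and zero off the active set.
S = np.nonzero(x0)[0]
s_S = np.sign(x0[S])
gamma0 = np.zeros(3)
gamma0[S] = np.linalg.solve(A[:, S].T @ A[:, S], s_S)

# Linear path: x(lam1) = x(lam0) - (lam1 - lam0) * gamma0.
assert np.allclose(x1, x0 - (lam1 - lam0) * gamma0, atol=1e-8)
```

The same check with a non-orthogonal $A$ works as long as both $\lambda$ values stay inside a single segment of the piecewise-linear path, i.e. the active set and signs do not change between them.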