Modification of Standard LASSO Problem for $N$ Observations


The traditional LASSO problem is given as \begin{align} \arg \min_x \frac{1}{2} || y - A x ||_2^2 + \lambda ||x||_1, \end{align} where $y \in \mathbb{R}^{P \times 1}$, $A \in \mathbb{R}^{P \times Q}$, and $x \in \mathbb{R}^{Q \times 1}$. The above problem can be solved using proximal gradient descent algorithms. Suppose we have multiple observations and an objective function of the form \begin{align} \arg \min_x \frac{1}{2} \sum_{i=1}^N || y_i - A_i x ||_2^2 + \lambda ||x||_1. \end{align} How do we solve this LASSO problem?

Edit: Changed it to proximal gradient
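For reference, a minimal sketch of proximal gradient descent (ISTA) for the standard single-observation LASSO, assuming numpy; the function names and step-size choice are illustrative, not from the original post:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    # Proximal gradient descent (ISTA) for
    #   min_x 0.5 * ||y - A x||_2^2 + lam * ||x||_1
    # using the fixed step size 1/L, where L = ||A||_2^2 is a
    # Lipschitz constant of the gradient of the smooth term.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)          # gradient of the least-squares term
        x = soft_threshold(x - grad / L, lam / L)  # proximal step
    return x
```

Each iteration takes a gradient step on the smooth least-squares term and then applies the proximal operator of the $\ell_1$ penalty.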

Best Answer

You can recreate the original form by defining new matrices.

Let's define:

$$ \hat{A} = \left[ {A}_{1} ; {A}_{2}; \cdots ; {A}_{N} \right] $$

That is, concatenate the matrices vertically, one beneath the other (I used MATLAB-style notation).

Do the same for the observation vector:

$$ \hat{y} = \left[ {y}_{1} ; {y}_{2}; \cdots ; {y}_{N} \right] $$
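In numpy, for example, the stacking can be done with `vstack` and `concatenate` (the block sizes below are an illustrative assumption):

```python
import numpy as np

# Hypothetical example: N = 3 observation blocks, each with P_i = 5 rows,
# and Q = 4 unknowns.
rng = np.random.default_rng(0)
A_list = [rng.standard_normal((5, 4)) for _ in range(3)]
y_list = [rng.standard_normal(5) for _ in range(3)]

# Stack the A_i one beneath the other (MATLAB's [A1; A2; A3]),
# and concatenate the y_i into a single long vector.
A_hat = np.vstack(A_list)       # shape (15, 4)
y_hat = np.concatenate(y_list)  # shape (15,)
```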

Now, just rewrite the problem:

$$ \arg \min_{x} \frac{1}{2} \sum_{i = 1}^{N} {\left\| {A}_{i} x - {y}_{i} \right\|}_{2}^{2} + \lambda {\left\| x \right\|}_{1} = \arg \min_{x} \frac{1}{2} {\left\| \hat{A} x - \hat{y} \right\|}_{2}^{2} + \lambda {\left\| x \right\|}_{1} $$

This you know how to solve :-).
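The equivalence of the two objectives can also be checked numerically; the sketch below evaluates both forms at a random point (all data here is synthetic, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A_list = [rng.standard_normal((5, 4)) for _ in range(3)]
y_list = [rng.standard_normal(5) for _ in range(3)]
x = rng.standard_normal(4)
lam = 0.1

# Sum-of-residuals form: 0.5 * sum_i ||A_i x - y_i||^2 + lam * ||x||_1
sum_form = (
    0.5 * sum(np.sum((A @ x - y) ** 2) for A, y in zip(A_list, y_list))
    + lam * np.sum(np.abs(x))
)

# Stacked form: 0.5 * ||A_hat x - y_hat||^2 + lam * ||x||_1
A_hat = np.vstack(A_list)
y_hat = np.concatenate(y_list)
stacked_form = 0.5 * np.sum((A_hat @ x - y_hat) ** 2) + lam * np.sum(np.abs(x))

assert np.isclose(sum_form, stacked_form)
```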