Reducing LASSO problem size on the fly with TFOCS


Suppose I'm using the TFOCS software package to solve the LASSO problem \begin{equation*} \text{minimize} \quad \frac12 \| Ax - b \|_2^2 + \gamma \| x \|_1. \end{equation*} The optimization variable is $x \in \mathbb R^n$, and the minimizer $x^\star$ is sparse. As the iteration progresses, TFOCS will "decide" that certain components of $x$ should equal $0$. As a heuristic to speed up convergence, I would like to omit those components of $x$, and the corresponding columns of $A$, from the optimization problem once every $50$ iterations. If $A$ is large, this could reduce the size of the problem considerably.
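For context, components of $x$ become exactly zero because first-order solvers for this problem apply the $\ell_1$ proximal operator (soft-thresholding) at every step, which maps small entries to exactly $0$. A minimal NumPy illustration of that operator (not TFOCS itself):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: shrinks entries toward 0,
    mapping any entry with magnitude <= t to exactly 0."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# The two small entries are zeroed; the large ones are shrunk by t.
print(soft_threshold(np.array([3.0, -0.2, 0.05, -2.0]), 0.5))
```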

Is there a nice way to do this pruning when using TFOCS? Is the best approach simply a loop: call TFOCS with its maximum number of iterations set to $50$, prune columns of $A$, call TFOCS again with the iteration limit set to $50$, and so on?

I suspect that restarting TFOCS every 50 iterations is a bad idea, because each restart resets certain internal algorithm parameters (a momentum parameter, for example).

Assuming I'm using the Nesterov 1983 algorithm, should I make a modified version of tfocs_N83.m in which affineF, projectorF, x_old, and A_x_old are updated (according to the pruning) every 50 iterations? At a glance, I don't think I need to modify smoothF, z_old, or A_z_old, but I might be misunderstanding the algorithm.


1 Answer

Best answer

I think the thing to do is to make a modified version of tfocs_N83.m where the variables x_old, z_old, apply_projector, and apply_linear are updated (to account for the pruning) every 50 iterations.

Note that the variables affineF and projectorF do not need to be updated; rather, apply_linear and apply_projector should be updated.
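The same idea can be sketched outside of TFOCS with a plain FISTA loop (names and structure are mine; TFOCS's tfocs_N83.m organizes its state differently). The point is that the iterates and the linear operator shrink every 50 iterations while the momentum parameter is carried across the pruning step rather than reset:

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_prune_inplace(A, b, gamma, n_iter=500, prune_every=50):
    """FISTA for 0.5*||Ax-b||^2 + gamma*||x||_1 that shrinks the problem
    mid-run; the momentum parameter t survives each pruning step."""
    n = A.shape[1]
    keep = np.arange(n)                  # surviving column indices
    x = np.zeros(n)                      # current iterate (x_old analogue)
    x_prev = np.zeros(n)                 # previous iterate (momentum state)
    t = 1.0                              # momentum parameter: never reset
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    for k in range(1, n_iter + 1):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev = x
        x = soft_threshold(y - A.T @ (A @ y - b) / L, gamma / L)
        t = t_next
        if k % prune_every == 0:
            # Drop components that are zero in both iterates, plus the
            # matching columns of A; t is deliberately left untouched.
            active = (x != 0.0) | (x_prev != 0.0)
            A, x, x_prev = A[:, active], x[active], x_prev[active]
            keep = keep[active]
            L = np.linalg.norm(A, 2) ** 2   # step size for the smaller problem
    x_full = np.zeros(n)
    x_full[keep] = x
    return x_full
```

Pruning both iterates together keeps the momentum extrapolation dimensionally consistent, which mirrors updating x_old and z_old together in tfocs_N83.m.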

I tried this and it's working well. The TFOCS code is easy to read and modify.