The problem of finding $$\min_x \left( \|Ax-b\|^2_2+\lambda \|x\|_1\right),$$ where $\|\cdot\|_2$ and $\|\cdot\|_1$ are the $L_2$ and $L_1$ norms, respectively, is usually called the LASSO. Here $A$ is a matrix, and $x$ and $b$ are vectors.
I believe this problem is robust against introducing some redundancy into the matrix $A$, for example by appending to $A$ an extra column that is a linear combination of the original columns. Since the LASSO solution is sparse, I expect this procedure not to change it much; that is, the solution to $$\min_y \left( \|\tilde{A}y-b\|^2_2+\lambda \|y\|_1\right),$$ where $\tilde{A}$ is the augmented matrix, should be very close to the original one.
Can this be proved? Can I find an upper bound on how much the solution changes?
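For intuition (not a proof), here is a minimal numerical sketch of the experiment I have in mind, assuming scikit-learn is available. Note that `sklearn`'s `Lasso` minimizes $\frac{1}{2n}\|Ax-b\|_2^2 + \alpha\|x\|_1$, so `alpha` corresponds to the $\lambda$ above only up to a factor of $2n$; the data, `alpha`, and dimensions below are arbitrary choices for illustration.

```python
# Sketch: solve the LASSO before and after appending a redundant column
# to A, then compare the two solutions and their fitted values.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 20
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[:3] = [2.0, -1.5, 1.0]            # sparse ground truth
b = A @ x_true + 0.05 * rng.standard_normal(n)

# Redundant column: a linear combination of the original columns.
extra = A @ rng.standard_normal(p)
A_tilde = np.hstack([A, extra[:, None]])

alpha = 0.1
x = Lasso(alpha=alpha, fit_intercept=False, max_iter=10_000).fit(A, b).coef_
y = Lasso(alpha=alpha, fit_intercept=False, max_iter=10_000).fit(A_tilde, b).coef_

print("||x - y[:p]||_2              =", np.linalg.norm(x - y[:p]))
print("coef. on redundant column    =", y[-1])
print("||A x - A_tilde y||_2        =", np.linalg.norm(A @ x - A_tilde @ y))
```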
It's enlightening to think about what happens if $A$ contains two identical columns, $a_i$ and $a_j$. In that case, the objective function cannot distinguish between a solution with $x_i = 1, x_j = 0$ and a solution with $x_i = \theta, x_j = 1 - \theta$ (where $0 < \theta < 1$): both the fit $a_i x_i + a_j x_j$ and the penalty $|x_i| + |x_j|$ are unchanged. It seems most likely that if either $x_i$ or $x_j$ is active in the solution found by your LASSO solver, then both will be active. In other words, the solution you find will not be as sparse as it could have been. For that reason, it can sometimes be helpful to use a non-convex sparsity-inducing regularizer, such as $\sum_i |x_i|^{1/2}$.
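A quick sketch of that degeneracy, with arbitrary illustrative data: with two identical columns, redistributing weight between them leaves the LASSO objective exactly unchanged, so the solver has no reason to prefer the sparser split.

```python
# Sketch: the LASSO objective is constant along the line segment
# x_i = theta, x_j = 1 - theta when columns i and j are identical.
import numpy as np

rng = np.random.default_rng(1)
n = 50
a = rng.standard_normal(n)
A = np.column_stack([a, a])              # two identical columns
b = 2.0 * a + 0.01 * rng.standard_normal(n)
lam = 0.1

def lasso_objective(x):
    return np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

for theta in [0.0, 0.25, 0.5, 1.0]:
    x = np.array([theta, 1.0 - theta])   # x_i = theta, x_j = 1 - theta
    print(f"theta = {theta:4.2f}  objective = {lasso_objective(x):.6f}")
```

Every `theta` in $[0, 1]$ prints the same objective value, since $a_i\theta + a_j(1-\theta) = a_i$ and $|\theta| + |1-\theta| = 1$ on that interval.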