Bounding the Lagrange Multiplier for Quadratically Constrained Basis Pursuit


I am considering an optimization problem of the form $$ \underset{x}{\min} \|x\|_1 \hspace{.5cm} \textrm{s.t.} \hspace{.5cm} \|Ax-y\| \leq t $$ in the context of compressed sensing, where $A \in \mathbb{R}^{m \times n}$ is the measurement matrix, $y = Ax' + \eta \in \mathbb{R}^{m}$ is the vector of noisy measurements, and $x' \in \mathbb{R}^{n}$ is an $s$-sparse signal I am seeking to recover. Assume that $t$ is a known upper bound on $\|\eta\|$.

Given that $A$ is a random matrix with independent standard normal entries, and given the noise on $y$, I would like to bound the range of values that the optimal solution $x^*$ can take (in order to bound its expected value). I introduced a Lagrange multiplier $\lambda$ to form the Lagrangian $$ L(x,\lambda) = \|x\|_1 + \lambda(\|Ax-y\| - t) $$ and tried to solve via the KKT conditions. Using $\nabla_x \|Ax-y\| = A^T(Ax-y)/\|Ax-y\|$, setting the subdifferential with respect to $x$ to zero, and assuming the constraint is active at the optimum (so $\|Ax-y\| = t$), solving for $x$ gives $$ x = (A^TA)^{-1}A^Ty - \frac{t}{\lambda}(A^TA)^{-1}\,\partial\|x\|_1, $$ where $\partial\|x\|_1$ is a valid subgradient of the $\ell_1$-norm at $x$. Since I only want bounds on the entries of $x$, I figure the $\partial\|x\|_1$ term on the right-hand side is not an issue: its entries lie in $[-1,1]$, so it can be bounded independently of $x$.

However, I am not sure how to go about bounding the Lagrange multiplier $\lambda$ (if that is even feasible). Please let me know if there is something I can do here to get a non-trivial bound on the values that $\lambda$ and/or the entries of $x$ can take.
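For concreteness, here is a small numerical sketch (assuming NumPy) of the setup I described. Rather than the constrained problem itself, it runs ISTA on the related penalized surrogate $\min_x \frac{1}{2}\|Ax-y\|^2 + \mu\|x\|_1$, and then checks the $\ell_1$ subgradient stationarity condition numerically; the penalty weight `mu` is a placeholder of my own choosing and is *not* the multiplier $\lambda$ of the constrained problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem setup as in the question: Gaussian measurement matrix,
# s-sparse signal, bounded noise, t = known bound on ||eta||.
m, n, s = 80, 200, 5
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)
eta = 0.01 * rng.standard_normal(m)
y = A @ x_true + eta
t = np.linalg.norm(eta)

# ISTA on the penalized surrogate  min 0.5*||Ax - y||^2 + mu*||x||_1.
# mu is a hypothetical penalty weight, not the lambda of the question.
mu = 0.1
L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the smooth part
step = 1.0 / L

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(n)
for _ in range(2000):
    grad = A.T @ (A @ x - y)
    x = soft_threshold(x - step * grad, step * mu)

# Stationarity check: on the support of x, A^T(y - Ax) should equal
# mu*sign(x); elsewhere its magnitude should be at most mu (subgradient
# condition for the l1-norm), up to solver tolerance.
corr = A.T @ (y - A @ x)
on_support = np.abs(x) > 1e-8
print(np.max(np.abs(corr[on_support] - mu * np.sign(x[on_support]))))
print(np.max(np.abs(corr[~on_support])))
```

The printed quantities are close to zero and at most `mu`, respectively, which is the same subgradient structure that appears in the stationarity equation above.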