Let $A\in\mathbb{C}^{m\times n}$, $\mathbf{b}\in\mathbb{C}^m$, and $\eta\geq 0$. Consider the problem \begin{alignat}{2} & \min_{\mathbf{x}\in\mathbb{C}^n} \ \ && \|\mathbf{x}\|_1\\ & \text{s.t.} && \|A\mathbf{x} - \mathbf{b}\|_2 \leq \eta. \end{alignat}
Show that it can be rewritten as a problem involving real variables:
\begin{alignat}{2} & \min_{\mathbf{x}\in\mathbb{R}^d} \ \ && \mathbf{c}^T\mathbf{x}\\ & \text{s.t.} && \mathbf{x}\in K\\ & && g(\mathbf{x}) \leq \eta \end{alignat}
where $K$ is a convex cone and $g$ is a convex function. Find $K$ and $g$ explicitly.
Since the 2-norm is convex, it seems clear that the constraint can be rewritten with a convex $g$. What I'm having trouble with is finding an explicit $K$ that works. Any help?
Thanks
I think this should work. To avoid confusion, replace the variable $\mathbf{x}$ in the top optimization problem with $z=(z_1,\ldots,z_n)\in \mathbb{C}^n$, and reserve $x\in \mathbb{R}^d$ for the variable in the bottom problem. Since $|z_i|=\sqrt{\text{Re}(z_i)^2+\text{Im}(z_i)^2}$, the objective in the top problem can be written as \begin{equation} \min_{z\in \mathbb{C}^n} \sum_{i=1}^n \sqrt{\text{Re}(z_i)^2+\text{Im}(z_i)^2}. \end{equation}
Let $d=2n+1$, and for each $z$ as above, think of $x\in \mathbb{R}^d$ as $x=(\text{Re}(z_1),\text{Im}(z_1),\ldots,\text{Re}(z_n),\text{Im}(z_n),t)$. Let $c=(0,\ldots,0,1)$, and define the convex cone \begin{equation} K=\left\{(y,t)\in \mathbb{R}^{2n}\times \mathbb{R}: t\geq \sum_{i=1}^n \sqrt{y_{2i-1}^2+y_{2i}^2}\right\}. \end{equation} Note that $K$ is precisely the epigraph of the convex function $f:\mathbb{R}^{2n}\to \mathbb{R}$ given by $f(y)=\sum_{i=1}^n \sqrt{y_{2i-1}^2+y_{2i}^2}$, so it is convex. It is a cone because the defining inequality is invariant under scaling by nonnegative numbers. For any fixed $y\in \mathbb{R}^{2n}$, the vectors $x=(y,t)\in K$ are exactly those whose last coordinate $t$ satisfies the desired inequality, and the minimal such $t$ is precisely $t=f(y)=\sum_{i=1}^n \sqrt{y_{2i-1}^2+y_{2i}^2}$.
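To make the construction concrete, here is a quick numerical sanity check in Python (the helper names `to_real_vector` and `in_cone` are mine, just for illustration):

```python
import math

def to_real_vector(z):
    """Map z in C^n to x = (Re z_1, Im z_1, ..., Re z_n, Im z_n, t),
    choosing the minimal feasible t = sum_i sqrt(Re(z_i)^2 + Im(z_i)^2)."""
    y = []
    for zi in z:
        y.extend([zi.real, zi.imag])
    t = sum(math.hypot(zi.real, zi.imag) for zi in z)
    return y + [t]

def in_cone(x):
    """Membership test for K: x = (y, t) with t >= sum_i sqrt(y_{2i-1}^2 + y_{2i}^2)."""
    y, t = x[:-1], x[-1]
    return t >= sum(math.hypot(y[2 * i], y[2 * i + 1]) for i in range(len(y) // 2))

z = [3 + 4j, -1j]            # a point in C^2 with ||z||_1 = 5 + 1 = 6
x = to_real_vector(z)        # x in R^5, with t in the last coordinate
assert in_cone(x)
# c = (0, ..., 0, 1), so c^T x = t, which equals ||z||_1:
assert x[-1] == sum(abs(zi) for zi in z)
```

Lowering $t$ below $\sum_i \sqrt{y_{2i-1}^2+y_{2i}^2}$ leaves the cone, which is exactly why minimizing $c^Tx$ over $K$ recovers the $\ell_1$ objective.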
As for $g(x)\leq \eta$: rewrite the constraint $\|Az-\mathbf{b}\|_2\leq \eta$ from the top problem in terms of the real and imaginary parts of $z$ (i.e., in terms of $y$), ignoring the $t$ variable; the resulting $g$ is a norm of an affine function of $x$, hence convex. The two problems are equivalent for the following reason. Given any feasible $z\in \mathbb{C}^n$ for the top problem, form the corresponding $x=(y,t)$ as described above, where $y$ collects the real and imaginary parts of $z$ and $t=\sum_{i=1}^n \sqrt{\text{Re}(z_i)^2+\text{Im}(z_i)^2}$. This $x$ satisfies the constraints of the bottom problem and has the same objective value, so the optimum of the top problem is at least the optimum of the bottom problem. Conversely, given any $x=(y,t)$ satisfying the constraints of the bottom problem, you can go backwards and form the appropriate $z$ from $y$; then $z$ is feasible for the top problem, and its objective value $\|z\|_1$ is at most $\mathbf{c}^Tx=t$ (strictly less only when $t> \sum_{i=1}^n \sqrt{y_{2i-1}^2+y_{2i}^2}$). Therefore the optimal values are equal, and there is an explicit mapping between optimal solutions of the two problems.
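The answer leaves $g$ implicit; a minimal sketch of one way to write it, assuming the data $A$ and $\mathbf{b}$ are available as complex arrays (the function name `g` is from the problem statement, but this particular encoding of $x$ and the pure-Python representation are my own illustrative choices):

```python
import math

def g(x, A, b):
    """||A z - b||_2 for the real problem, where
    x = (Re z_1, Im z_1, ..., Re z_n, Im z_n, t).
    The t coordinate is ignored, so g depends only on the y-part of x."""
    y = x[:-1]
    n = len(y) // 2
    z = [complex(y[2 * i], y[2 * i + 1]) for i in range(n)]
    residual = [sum(A[k][i] * z[i] for i in range(n)) - b[k] for k in range(len(b))]
    return math.sqrt(sum(abs(r) ** 2 for r in residual))

A = [[1, 1j], [2, 0]]        # a small 2x2 complex example
b = [1 + 1j, 2]
# z = (1, 1) solves A z = b exactly, and the value of t does not matter:
assert g([1.0, 0.0, 1.0, 0.0, 2.0], A, b) == 0.0
assert g([1.0, 0.0, 1.0, 0.0, 99.0], A, b) == 0.0
```

Since $g$ is a norm composed with an affine map of $x$, the constraint $g(x)\leq \eta$ is convex, as required by the bottom problem.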