For $\mathbf{x} \in \mathbb{C}^N$, I'd like to solve the following problem:
$$ \mathbf{x}^\ast = \arg \min_{\mathbf{x}} \Vert \mathbf{Ax-b} \Vert_2 \,\,\,\,\,\, \mathrm{s.t.} \,\,\,\,\, \vert x_i \vert = a_i, \,\,\,\, i = 0, \dots, N-1, $$
where $a_i \in \mathbb{R}$, $a_i \geq 0$. The above is a least-squares problem where the magnitudes of the elements of $\mathbf{x}$ are fixed and only their phases may vary.
Can anyone point me in the direction of how to solve this? I have tried adding the equality constraints to the cost function as a penalty term, but without success. Though I have not found anything yet, I am hoping this is a well-studied problem with a known solution.
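To make the setup concrete, here is a small numerical instance. Since the constraint fixes $\vert x_i \vert = a_i$, the feasible set can be parameterized as $x_i = a_i e^{j \theta_i}$, turning the problem into an unconstrained (but non-convex) minimization over the phases. The data below are random placeholders, and SciPy is used only as a generic solver for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Random problem data, purely for illustration.
rng = np.random.default_rng(0)
M, N = 8, 5
A = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))
b = rng.standard_normal(M) + 1j * rng.standard_normal(M)
a = rng.uniform(0.5, 2.0, N)              # fixed magnitudes a_i >= 0

def objective(theta):
    # x_i = a_i * exp(j * theta_i): the magnitude constraint holds by construction.
    x = a * np.exp(1j * theta)
    return np.linalg.norm(A @ x - b)

res = minimize(objective, x0=np.zeros(N), method="BFGS")
x_opt = a * np.exp(1j * res.x)            # feasible by construction: |x_i| = a_i
```

This trades the constraint for a non-convex landscape over $\theta \in \mathbb{R}^N$, so a local solver like this may only find a local minimum.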
Thanks for any help you can provide.
Really nice and interesting question.
I first tried solving it in the real domain.
Using a brute-force solution as my reference, I worked with the following cost function:
$$ \arg \min_{x} f \left( x \right) = \arg \min_{x} \frac{1}{2} \left\| A x - b \right\|_{2}^{2} + \frac{\lambda}{2} \left\| \operatorname{abs} \left( x \right) - a \right\|_{2}^{2} $$
where $ \operatorname{abs} \left( x \right) $ is applied element-wise.
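In code, this penalized cost can be sketched as follows (real domain; the data here are random placeholders of my own, for illustration only):

```python
import numpy as np

def cost(x, A, b, a, lam):
    """Penalized cost: 0.5*||Ax - b||_2^2 + 0.5*lam*||abs(x) - a||_2^2."""
    r = A @ x - b
    return 0.5 * (r @ r) + 0.5 * lam * np.sum((np.abs(x) - a) ** 2)

# Random placeholder data.
rng = np.random.default_rng(0)
M, N = 10, 4
A = rng.standard_normal((M, N))
b = rng.standard_normal(M)
a = rng.uniform(0.5, 2.0, N)
x = rng.standard_normal(N)
```

With $\lambda = 0$ this reduces to the plain least-squares cost, and at any feasible point ($\operatorname{abs}(x) = a$) the penalty term vanishes regardless of $\lambda$.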
The gradient is given by:
$$ \frac{d}{d x} f \left( x \right) = {A}^{T} \left( A x - b \right) + \lambda \operatorname{sign} \left( x \right) \odot \left( \operatorname{abs} \left( x \right) - a \right), $$
where $ \odot $ denotes the element-wise (Hadamard) product.
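A numerical sanity check of this gradient against central differences (my own sketch with random placeholder data; $x$ is kept bounded away from zero, where $\operatorname{sign}$ is not differentiable):

```python
import numpy as np

def cost(x, A, b, a, lam):
    r = A @ x - b
    return 0.5 * (r @ r) + 0.5 * lam * np.sum((np.abs(x) - a) ** 2)

def grad(x, A, b, a, lam):
    # A^T (Ax - b) + lam * sign(x) * (abs(x) - a), element-wise product
    return A.T @ (A @ x - b) + lam * np.sign(x) * (np.abs(x) - a)

# Random placeholder data; |x_i| >= 0.5 keeps x away from the kink at 0.
rng = np.random.default_rng(1)
M, N = 10, 4
A = rng.standard_normal((M, N))
b = rng.standard_normal(M)
a = rng.uniform(0.5, 2.0, N)
x = rng.uniform(0.5, 1.5, N) * rng.choice([-1.0, 1.0], N)
lam = 0.5

# Central-difference approximation of the gradient.
eps = 1e-6
g_num = np.array([
    (cost(x + eps * e, A, b, a, lam) - cost(x - eps * e, A, b, a, lam)) / (2 * eps)
    for e in np.eye(N)
])
```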
I tried gradient descent, raising the value of $ \lambda $ at each iteration (a penalty / continuation scheme).
It did not work badly, but even the signs of the solution were not always consistent with the optimal solution.
My intermediate code is given here.
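In outline, the continuation loop can be sketched as follows (my own sketch with random placeholder data, not the linked code; the step size comes from a Lipschitz estimate, and the $\lambda$ schedule is one plausible choice):

```python
import numpy as np

# Random placeholder data with a known feasible "ground truth".
rng = np.random.default_rng(2)
M, N = 12, 5
A = rng.standard_normal((M, N))
a = rng.uniform(0.5, 2.0, N)
x_true = a * rng.choice([-1.0, 1.0], N)   # feasible point: |x_true_i| = a_i
b = A @ x_true

x = rng.standard_normal(N)                # random initialization
r0 = np.linalg.norm(A @ x - b)            # initial residual, for reference
lam = 0.1
L_A = np.linalg.norm(A.T @ A, 2)          # Lipschitz constant of the LS gradient

for it in range(20000):
    g = A.T @ (A @ x - b) + lam * np.sign(x) * (np.abs(x) - a)
    x -= g / (L_A + lam)                  # step size safe for the current lam
    if it % 200 == 0:
        lam = min(1.25 * lam, 1e4)        # raise the penalty weight gradually
```

As $\lambda$ grows, the iterates are pushed toward the constraint set, but nothing forces the sign pattern to match that of the constrained optimum, which is consistent with the behavior described above.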
I will try two more approaches: