I would like to find x such that (Ax).^2 + (Bx).^2 == I (using Matlab syntax, so the squaring is elementwise). A and B are matrices and I is a vector, all real-valued. There are fewer equations than variables, so I can add regularization (for example, minimize norm(x) subject to the equation above).
I tried using CVX (convex optimization software), but kept getting: Disciplined convex programming error. Isn't this expression convex?
How can x be found?
Thank you!
If I understand you correctly, you are trying to solve the following optimization problem:
$$ \begin{aligned} \min_x &\quad ||x|| \\ \text{s.t.} &\quad ((Ax)_j)^2 + ((Bx)_j)^2 = I_j &~\forall j=1,\dots,m \end{aligned} $$ Your constraints are non-linear equality constraints, which make the feasible set non-convex: each left-hand side is a convex function of $x$, but the set where a convex function *equals* a constant is generally not convex. Hence, CVX cannot help you (directly).
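To see the non-convexity concretely, here is a tiny 1-D instance of my own (the particular $A$, $B$, $I$ are illustrative assumptions, not from the question): with $A = [1]$, $B = [0]$, $I = [1]$, the constraint reduces to $x^2 = 1$, whose solution set $\{-1, +1\}$ is not convex:

```python
import numpy as np

# Toy 1-D instance (an illustrative assumption): A = [[1]], B = [[0]], I = [1].
# The constraint (Ax)^2 + (Bx)^2 == I then reduces to x^2 == 1.
f = lambda x: x**2

# Both x = -1 and x = +1 are feasible...
assert np.isclose(f(-1.0), 1.0) and np.isclose(f(1.0), 1.0)

# ...but their midpoint is not, so the feasible set is non-convex.
midpoint = 0.5 * (-1.0 + 1.0)
print(f(midpoint))  # 0.0, which violates x^2 == 1
```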
I suggest trying to solve this problem with some variation of a trust-region method. It is not guaranteed to find a solution, but it has a good chance of getting close. A full explanation will not fit in one post, so I encourage you to read up on it; here is one concrete idea.
Let $f_j(x) = ((Ax)_j)^2 + ((Bx)_j)^2$, and let $g_j(x, y) = f_j(y) + (\nabla f_j(y))^T(x - y)$. As you can probably guess, $g_j$ is the first-order Taylor approximation of $f_j$ at $y$. Also, note that $g_j$ is affine (linear) in $x$. The idea is the following:

1. Pick an initial guess $x^0$ and a trust-region radius $R > 0$.
2. Given $x^k$, let $x^{k+1}$ be the solution of $$ \begin{aligned} \min_x &\quad ||x|| \\ \text{s.t.} &\quad g_j(x, x^k) = I_j &~\forall j=1,\dots,m \\ &\quad ||x - x^k|| \le R \end{aligned} $$
3. Repeat step (2) until the iterates stop changing (or a maximum iteration count is reached).
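In matrix form, the Jacobian of $f$ is $J(x) = 2\,\mathrm{diag}(Ax)A + 2\,\mathrm{diag}(Bx)B$, so $g(x,y) = f(y) + J(y)(x-y)$. Here is a NumPy sketch of my own (random data stands in for your $A$ and $B$) that checks the Taylor approximation numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5                      # fewer equations than variables (as in the question)
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, n))

def f(x):
    """f_j(x) = ((Ax)_j)^2 + ((Bx)_j)^2, returned as a vector of length m."""
    return (A @ x)**2 + (B @ x)**2

def jac(x):
    """Jacobian of f: row j is (grad f_j)^T = 2(Ax)_j A_j + 2(Bx)_j B_j."""
    return 2 * np.diag(A @ x) @ A + 2 * np.diag(B @ x) @ B

def g(x, y):
    """First-order Taylor approximation of f around y; affine in x."""
    return f(y) + jac(y) @ (x - y)

# Sanity check: g matches f to first order near y (error is O(||h||^2)).
y = rng.standard_normal(n)
h = 1e-6 * rng.standard_normal(n)
print(np.linalg.norm(f(y + h) - g(y + h, y)))  # tiny, on the order of ||h||^2
```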
You can see that each sub-problem in (2) is convex, since $g_j$ is affine in $x$ and the radius constraint is convex. So you can use CVX to produce $x^{k+1}$ from $x^k$. However, convergence is not guaranteed, and the outcome depends heavily on the choice of $x^0$ and $R$.
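For illustration, here is a self-contained NumPy sketch of the scheme (my own, not your code): since the linearized equalities $g(x, x^k) = I$ form an underdetermined linear system $Jx = c$, `np.linalg.lstsq` gives its minimum-norm solution exactly, and I simply damp each step to length $R$ as a cheap stand-in for the exact trust-region constraint (in practice you would solve the full sub-problem with CVX). The data and warm start are assumptions made to keep the example runnable:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5                                   # fewer equations than variables
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, n))
x_feas = rng.standard_normal(n)
I_vec = (A @ x_feas)**2 + (B @ x_feas)**2     # target, feasible by construction

def f(x):
    return (A @ x)**2 + (B @ x)**2

def jac(x):
    return 2 * np.diag(A @ x) @ A + 2 * np.diag(B @ x) @ B

def step(xk, R):
    """One pass of the scheme: solve min ||x|| s.t. g(x, xk) = I exactly
       (via the minimum-norm solution of J x = c), then damp the step to
       length R -- a cheap approximation of the trust-region constraint."""
    J = jac(xk)
    c = J @ xk + (I_vec - f(xk))              # g(x, xk) = I  <=>  J x = c
    x_new = np.linalg.lstsq(J, c, rcond=None)[0]
    d = x_new - xk
    if np.linalg.norm(d) > R:
        d *= R / np.linalg.norm(d)
    return xk + d

xk = x_feas + 0.1 * rng.standard_normal(n)    # warm start near a feasible point (an assumption)
for _ in range(100):
    xk = step(xk, R=1.0)
print(np.linalg.norm(f(xk) - I_vec))          # residual of the original equations
```

Note that the printed residual is only small if the iteration happened to converge; as said above, that depends on $x^0$ and $R$.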