linear least squares -- complex observations, real estimate constraint


Consider the following least squares optimization problem:

$$ \hat{x} = \arg\min_x \| y - A x\|^2 $$

where the observations are complex, $y\in\mathbb{C}^{N\times 1}$, and the complex design matrix $A\in \mathbb{C}^{N\times K}$ has full column rank $K$. Is there a simple closed-form solution if $x$ is constrained to be real (i.e., $x\in\mathbb{R}^{K \times 1}$)?

Best answer:

If you write $y=y_1+iy_2$ and $A=A_1+iA_2$, then your problem is equivalent to finding:

$$\min_{x\in\mathbf{R}^K}f(x)=\lVert y_1-A_1x\rVert^2+\lVert y_2-A_2 x\rVert^2$$

Now we are left with minimizing an unconstrained convex function:

$$\nabla f(x)=-2A_1^T(y_1-A_1x)-2A_2^T(y_2-A_2x)$$

Note that $A_1^TA_1+A_2^T A_2$ is still positive definite (the stacked matrix $\begin{bmatrix}A_1\\A_2\end{bmatrix}$ has full column rank because $A$ does), so the unique minimizer is:

$$x_0=(A_1^TA_1+A_2^TA_2)^{-1}(A_1^T y_1+A_2^T y_2)$$
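A quick numerical sketch of this closed form (with randomly generated example data, not from the original post): split $y$ and $A$ into real and imaginary parts, apply the formula, and cross-check against an ordinary real least-squares solve on the stacked system.

```python
import numpy as np

# Hypothetical example data: complex observations y, complex design A.
rng = np.random.default_rng(0)
N, K = 8, 3
A = rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))
y = rng.standard_normal(N) + 1j * rng.standard_normal(N)

A1, A2 = A.real, A.imag
y1, y2 = y.real, y.imag

# Closed-form real minimizer from the answer above.
x0 = np.linalg.solve(A1.T @ A1 + A2.T @ A2, A1.T @ y1 + A2.T @ y2)

# Cross-check: stacking real and imaginary parts gives an ordinary
# real least-squares problem with the same minimizer.
A_stack = np.vstack([A1, A2])
y_stack = np.concatenate([y1, y2])
x_ls, *_ = np.linalg.lstsq(A_stack, y_stack, rcond=None)

assert np.allclose(x0, x_ls)
```

The stacked form makes the equivalence in the first display explicit: minimizing $\lVert y_1-A_1x\rVert^2+\lVert y_2-A_2x\rVert^2$ is the same as one real least-squares problem of size $2N\times K$.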

Another answer:

Let \begin{align} f(x) := \|y - Ax\|_2^2 = \left( y - Ax \right)^*: \left( y - Ax \right), \end{align} where $(\cdot)^*$ denotes the elementwise complex conjugate and $u : v := \sum_i u_i v_i$ (so $u^* : u = \|u\|_2^2$).

Now, let us compute the gradient of $f(x)$ (by computing the differential first), i.e., \begin{align} df(x) &= \left[ -A^*dx: \left( y - Ax \right) \right] + \left[ \left( y - Ax \right)^*: -A dx \right] \\ &= \left[ \left( y - Ax \right): -A^*dx \right] + \left[ -A^T\left( y - Ax \right)^*: dx \right] \\ &= \left[ -A^H\left( y - Ax \right):dx \right] + \left[ -A^T\left( y - Ax \right)^*: dx \right] \\ &= \left[\left( -A^H\left( y - Ax \right) \right) + \left( -A^T\left( y - Ax \right)^*\right) \right]: dx \end{align}

Setting the gradient to zero gives \begin{align} \frac{\partial f(x)}{\partial x} &= -A^H\left( y - Ax \right) - A^T\left( y - Ax \right)^* = 0 \\ &\Rightarrow x = \left( A^HA + A^TA^*\right)^{-1} \left(A^Hy + A^Ty^* \right). \end{align} Since $A^TA^* = (A^HA)^*$ and $A^Ty^* = (A^Hy)^*$, this simplifies to $x = \big(\operatorname{Re}(A^HA)\big)^{-1}\operatorname{Re}(A^Hy)$, which agrees with the other answer because $\operatorname{Re}(A^HA) = A_1^TA_1 + A_2^TA_2$ and $\operatorname{Re}(A^Hy) = A_1^Ty_1 + A_2^Ty_2$.
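A short check (again with randomly generated example data) that the complex-notation solution is real and matches the real-part form $x = \operatorname{Re}(A^HA)^{-1}\operatorname{Re}(A^Hy)$:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 10, 4
A = rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))
y = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Solution in complex notation: (A^H A + A^T A*)^{-1} (A^H y + A^T y*).
lhs = A.conj().T @ A + A.T @ A.conj()
rhs = A.conj().T @ y + A.T @ y.conj()
x = np.linalg.solve(lhs, rhs)

# The imaginary part vanishes up to rounding, and the result matches
# the real-part formulation (the factors of 2 cancel).
x_real = np.linalg.solve((A.conj().T @ A).real, (A.conj().T @ y).real)
assert np.allclose(x.imag, 0, atol=1e-8)
assert np.allclose(x.real, x_real)
```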