Solve $\min_{\mathbf{x},\mathbf{R}}\left\|\mathbf{y}-\mathbf{x}\right\|_{\mathbf{R}}^2$ s.t. $\mathbf{A}\mathbf{x}=\mathbf{0}$; $\mathbf{R}\succeq0$


Dear optimization experts,

Do you have a suggestion for how to solve the following problem?

\begin{align} \min_{\mathbf{x}, \mathbf{R}, \mathbf{\lambda}} f(\mathbf{x}, \mathbf{R}) &= \left\| \mathbf{y} - \mathbf{x} \right\|_{\mathbf{R}}^2 + \mathbf{\lambda}^T \mathbf{A}\mathbf{x} \\ &= \left( \mathbf{y} - \mathbf{x} \right)^T \mathbf{R} \left( \mathbf{y} - \mathbf{x} \right) + \mathbf{\lambda}^T \mathbf{A}\mathbf{x}, \end{align} where $\mathbf{y} \in \mathbb{R}^{n \times 1}$ and $\mathbf{A} \in \mathbb{R}^{n \times n}$ are given, $\mathbf{x} \in \mathbb{R}^{n \times 1}$ and $\mathbf{R} \in \mathbb{R}^{n \times n}$ are the optimization variables, and $\mathbf{\lambda} \in \mathbb{R}^{n \times 1}$ is a Lagrange multiplier.

If the problem depends on only a single variable (plus the Lagrange multiplier), i.e. either $\mathbf{x}$ alone or $\mathbf{R}$ alone, then it can be solved in closed form. But I am not sure how to solve it jointly over both variables. Thank you very much in advance.

EDIT:

Original constrained optimization problem \begin{align*} \text{minimize}_{\mathbf{x}, \mathbf{R}} \quad \ & \left( \mathbf{y} - \mathbf{x} \right)^T \mathbf{R} \left( \mathbf{y} - \mathbf{x} \right) \\ \text{subject to }\quad & \mathbf{A} \mathbf{x} = \mathbf{0} \end{align*}

EDIT 2: There is also the constraint $\mathbf{R} \succeq 0$, i.e. $\mathbf{R}$ must be positive semi-definite.


If $\mathbf{R}$ is fixed and $\mathbf{R} + \mathbf{R}^T$ is invertible (so, to be safe, $\mathbf{R}$ positive definite), then I can show that (as also suggested by Amrit P.) \begin{align} \mathbf{x} = \mathbf{y} - \left(\mathbf{R} + \mathbf{R}^T \right)^{-1} \mathbf{A}^T \mathbf{\lambda} \ , \end{align} and $\mathbf{\lambda}$ can be obtained by plugging $\mathbf{x}$ into the constraint \begin{align} \mathbf{A}\mathbf{x} = \mathbf{0} \ , \end{align} which gives \begin{align} \mathbf{\lambda} = \left(\mathbf{A} \left(\mathbf{R} + \mathbf{R}^T \right)^{-1} \mathbf{A}^T \right)^{-1} \mathbf{A} \mathbf{y} \ , \end{align} provided $\mathbf{A} \left(\mathbf{R} + \mathbf{R}^T \right)^{-1} \mathbf{A}^T$ is invertible.
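This fixed-$\mathbf{R}$ closed form is easy to check numerically. A minimal NumPy sketch with hypothetical data; here $\mathbf{A}$ is taken wide with full row rank (rather than square, as in the problem statement) so that $\mathbf{A}(\mathbf{R}+\mathbf{R}^T)^{-1}\mathbf{A}^T$ is invertible and the null space of $\mathbf{A}$ is nontrivial:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3
y = rng.standard_normal(n)
# Hypothetical data: a wide, full-row-rank A gives a nontrivial null space.
A = rng.standard_normal((m, n))
M = rng.standard_normal((n, n))
R = M @ M.T + np.eye(n)            # symmetric positive definite weight

S_inv = np.linalg.inv(R + R.T)     # (R + R^T)^{-1}
# lambda = (A (R + R^T)^{-1} A^T)^{-1} A y, obtained from A x = 0
lam = np.linalg.solve(A @ S_inv @ A.T, A @ y)
x = y - S_inv @ A.T @ lam          # x = y - (R + R^T)^{-1} A^T lambda

print(np.allclose(A @ x, 0))       # constraint A x = 0 holds
```

The same two lines (`solve` for $\mathbf{\lambda}$, then back-substitution for $\mathbf{x}$) work for any fixed positive definite $\mathbf{R}$.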


I am still wondering what happens if we want to find the optimal $\mathbf{R}$ as well... perhaps I am hitting a wall :( ...

I saw a paper that also obtains $\mathbf{R}$, but not jointly; e.g., cf. R. Kumar and A. Tyagi, "Weighted Least Squares Based Spectral Precoder for OFDM Cognitive Radio," IEEE Wireless Communications Letters, Dec. 2015.

Are you sure that $R$ is a variable in this optimization? Without that condition, it's a relatively straightforward problem.

Since you have already folded the constraint into the objective, we can take partial derivatives with respect to the variables of the problem ($x, R, \lambda$) and set them to zero to find the stationary points.

$$\frac{\partial{f(x,R,\lambda)}}{\partial x}=-(y-x)^T(R+R^T)+\lambda^TA=0\tag{1}$$

$$\frac{\partial{f(x,R,\lambda)}}{\partial R}=(y-x)(y-x)^T=0\tag{2}$$

$$\frac{\partial{f(x,R,\lambda)}}{\partial\lambda}=x^TA^T=0\tag{3}$$

Equation $2$ suggests that $x=y$, which makes sense since that would immediately set the positive part of the objective to $0$.
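Equation $2$'s gradient is easy to sanity-check with finite differences, since $f$ is linear in $R$. A small sketch with hypothetical data (the central difference is exact here, up to floating-point rounding, precisely because of that linearity):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
y = rng.standard_normal(n)
x = rng.standard_normal(n)
R = rng.standard_normal((n, n))

def f(R):
    # The R-dependent part of the objective: (y - x)^T R (y - x)
    return (y - x) @ R @ (y - x)

# Central-difference gradient of f with respect to each entry of R
eps = 1e-5
G = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = eps
        G[i, j] = (f(R + E) - f(R - E)) / (2 * eps)

# The analytic gradient is the outer product (y - x)(y - x)^T
print(np.allclose(G, np.outer(y - x, y - x)))
```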

Equation $3$ says that $x$ must lie in the null space of $A$. Combining equations $2$ and $3$ tells us that a stationary point exists only if $y$ belongs to the null space of $A$:

$$y=\sum_ic_ia_i^{\perp}=A^{\perp}c$$

where the $a_i^{\perp}$ are the columns of $A^{\perp}$, a matrix whose columns span the null space of $A$, i.e. $AA^{\perp}=0$.
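A basis $A^{\perp}$ for the null space can be read off from an SVD (the right singular vectors with zero singular value); `scipy.linalg.null_space` does the same thing. A sketch with a hypothetical rank-deficient $A$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 5, 2
# Hypothetical rank-deficient square A (rank r), so null(A) is nontrivial.
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))

# Columns of A_perp span null(A): right singular vectors with ~zero singular value.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
A_perp = Vt[rank:].T               # satisfies A @ A_perp = 0

y_feasible = A_perp @ rng.standard_normal(n - rank)  # y in null(A): x = y works
y_generic = rng.standard_normal(n)                   # generic y: A y != 0

print(np.allclose(A @ y_feasible, 0))  # True
print(np.allclose(A @ y_generic, 0))   # almost surely False
```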

Now let's consider the case where the above is not true, i.e.

$$y=A^{\perp}c+v,\qquad v=A^Td\neq0$$

In that case, if you were to run the above through a convex optimizer, you would end up with

$$x=y+\epsilon:A(v+\epsilon)=0$$

$$\min_{\epsilon,R,\lambda}\epsilon^TR\epsilon+\lambda^T(Av+A\epsilon)$$

$$\implies 2R\epsilon+A^T\lambda=0\Leftrightarrow\epsilon=-\frac{1}{2}R^{-1}A^T\lambda$$

(differentiating with respect to $\epsilon$, taking $R$ symmetric and invertible)

$$A\epsilon=-Av\Leftrightarrow AR^{-1}A^T\lambda=2Av$$

We can calculate $\lambda$ and $\epsilon$ uniquely if $AR^{-1}A^T$ is invertible.
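Under those assumptions ($R$ symmetric positive definite, $A$ full row rank so that $AR^{-1}A^T$ is invertible), the correction step can be sketched with hypothetical data; note $Av = Ay$, since the null-space component $A^{\perp}c$ of $y$ is annihilated by $A$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 5, 3
A = rng.standard_normal((m, n))   # full row rank, so A R^{-1} A^T is invertible
M = rng.standard_normal((n, n))
R = M @ M.T + np.eye(n)           # symmetric positive definite
y = rng.standard_normal(n)

R_inv = np.linalg.inv(R)
# Solve A R^{-1} A^T lambda = 2 A v, using A v = A y.
lam = np.linalg.solve(A @ R_inv @ A.T, 2 * A @ y)
eps = -0.5 * R_inv @ A.T @ lam    # eps = -1/2 R^{-1} A^T lambda
x = y + eps

print(np.allclose(A @ x, 0))      # the corrected x = y + eps is feasible
```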