A simple least square optimization


I came across the following unconstrained least-squares optimization problem:

$\underset{a_i,\mathbf{x}}{\mathrm{minimize}}\quad\sum_i\left\Vert\mathbf{y}_i-a_i\mathbf{x}\right\Vert_2^2$.

Is it possible to solve it in closed form? Thanks.


Best answer (score: 9)

Let $$L(\mathbf{x},a_i ) =\sum_i\left\Vert\mathbf{y}_i-a_i\mathbf{x}\right\Vert_2^2 = \sum_i \left(\Vert \mathbf{y}_i\Vert_2^2 - a_i \mathbf{y}_i^{H}\mathbf{x} -a_i^{*}\mathbf{x}^{H}\mathbf{y}_i+ |a_i|^2 \Vert \mathbf{x}\Vert_2^2\right),$$

where $^*$ and $^H$ denote the complex conjugate and the conjugate transpose, respectively.

Thus, the stationary points satisfy

$$\dfrac{\partial L}{\partial \mathbf{x}} = \sum_i \left(-a_i \mathbf{y}_i^{H} + |a_i|^2\mathbf{x}^{H}\right) = \mathbf{0}$$

$$\dfrac{\partial L}{\partial a_i} = -\mathbf{y}_i^{H}\mathbf{x} + a_i^{*}\Vert \mathbf{x}\Vert_2^2 = \mathbf{0}$$

From the second equation, $a_i = \dfrac{\mathbf{x}^{H}\mathbf{y}_i}{\Vert \mathbf{x}\Vert_2^2}$. Substituting this into the first term of the first equation, we have

$$\sum_i \left(-\dfrac{\mathbf{x}^{H}\mathbf{y}_i}{\Vert \mathbf{x}\Vert_2^2}\mathbf{y}_i^{H} + |a_i|^2\mathbf{x}^{H}\right) = \mathbf{0}.$$ Taking the conjugate transpose and rearranging yields $$\left(\sum_i \mathbf{y}_i \mathbf{y}_i^{H}\right)\mathbf{x} = \lambda \mathbf{x}, \qquad \lambda = \Vert \mathbf{x}\Vert_2^2\sum_i|a_i|^2,$$ which is an eigenvalue problem: $\mathbf{x}$ must be an eigenvector of $\sum_i \mathbf{y}_i \mathbf{y}_i^{H}$.
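As a numerical sanity check (not from the original answer; the data and variable names `Y`, `S`, `x`, `a` are illustrative), the stationarity condition can be solved with one Hermitian eigen-decomposition in NumPy:

```python
import numpy as np

# Synthetic complex data: columns of Y play the role of the vectors y_i.
rng = np.random.default_rng(0)
n, m = 5, 20                          # dimension of each y_i, number of vectors
Y = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))

# Stationarity: (sum_i y_i y_i^H) x = lambda x, an eigenvalue problem.
S = Y @ Y.conj().T                    # sum_i y_i y_i^H (Hermitian, n x n)
eigvals, eigvecs = np.linalg.eigh(S)  # eigenvalues in ascending order
x = eigvecs[:, -1]                    # eigenvector of the largest eigenvalue
a = x.conj() @ Y                      # a_i = x^H y_i (eigh returns unit x)

# Residual of the fit y_i ~ a_i x at this stationary point.
residual = np.sum(np.abs(Y - np.outer(x, a)) ** 2)
```

Here `residual` should equal $\sum_i\Vert\mathbf{y}_i\Vert_2^2 - \lambda_{\max}$, matching the value of the objective derived in the edit below.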

EDIT - Additional details:

Notice that, at a stationary point (using $\mathbf{y}_i^{H}\mathbf{x} = a_i^{*}\Vert \mathbf{x}\Vert_2^2$ and $\mathbf{x}^{H}\mathbf{y}_i = a_i\Vert \mathbf{x}\Vert_2^2$), the objective can be written as

$$\sum_i \left(\Vert \mathbf{y}_i\Vert_2^2 - a_i \mathbf{y}_i^{H}\mathbf{x} -a_i^{*}\mathbf{x}^{H}\mathbf{y}_i+ |a_i|^2 \Vert \mathbf{x}\Vert_2^2\right)= \sum_i \left(\Vert \mathbf{y}_i\Vert_2^2 - a_i a_i^{*}\Vert \mathbf{x}\Vert_2^2 -a_i^{*}a_i \Vert \mathbf{x}\Vert_2^2 + |a_i|^2 \Vert \mathbf{x}\Vert_2^2\right)$$ $$ = \sum_i \left(\Vert \mathbf{y}_i\Vert_2^2 - |a_i|^2 \Vert \mathbf{x}\Vert_2^2\right) = \sum_i \left(\Vert \mathbf{y}_i\Vert_2^2\right) - \lambda.$$

Thus, minimizing the objective amounts to maximizing $\lambda$: take $\mathbf{x}$ to be an eigenvector associated with the largest eigenvalue of $\sum_i \mathbf{y}_i \mathbf{y}_i^{H}$, and recover $a_i = \mathbf{x}^{H}\mathbf{y}_i/\Vert \mathbf{x}\Vert_2^2$. Note that the solution is determined only up to scale, since replacing $(a_i, \mathbf{x})$ by $(a_i/c, c\,\mathbf{x})$ leaves the objective unchanged.
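This choice can be verified numerically (an illustrative sketch on synthetic data, not part of the original answer): with $\mathbf{x}$ fixed at a unit eigenvector and each $a_i$ set to its optimal value, the objective equals $\sum_i\Vert\mathbf{y}_i\Vert_2^2 - \lambda$, so the largest eigenvalue gives the smallest objective.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 12
Y = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))

S = Y @ Y.conj().T                    # sum_i y_i y_i^H
eigvals, eigvecs = np.linalg.eigh(S)  # eigenvalues in ascending order

def best_objective(x):
    """Objective with direction x fixed and each a_i chosen optimally."""
    x = x / np.linalg.norm(x)
    a = x.conj() @ Y                  # a_i = x^H y_i for unit-norm x
    return np.sum(np.abs(Y - np.outer(x, a)) ** 2)

total = np.sum(np.abs(Y) ** 2)
objectives = [best_objective(eigvecs[:, k]) for k in range(n)]
```

Each entry of `objectives` equals `total - eigvals[k]`, so the last eigenvector (largest eigenvalue) attains the minimum.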