Solving a Variation of Linear Least Squares - $\arg\min_{x} \frac{1}{2} {\left\| \left( \sum_{i} {A}_{i} x {b}_{i}^{T} \right) - C \right\|}_{F}^{2}$


How does one solve the following variant of the Linear Least Squares problem:

$$ \arg \min_{x \in \mathbb{R}^{n}} \frac{1}{2} {\left\| \left( \sum_{i} {A}_{i} x {b}_{i}^{T} \right) - C \right\|}_{F}^{2} $$

Where $ {A}_{i} \in \mathbb{R}^{m \times n} $, $ {b}_{i} \in \mathbb{R}^{k} $, $ C \in \mathbb{R}^{m \times k} $ and $ {\left\| \cdot \right\|}_{F} $ is the Frobenius Norm.

Best Answer

I will rewrite the problem for clarity:

$$ \arg \min_{x \in \mathbb{R}^{n}} \frac{1}{2} {\left\| \left( \sum_{i} {A}_{i} x {b}_{i}^{T} \right) - C \right\|}_{F}^{2} $$

Where $ {A}_{i} \in \mathbb{R}^{m \times n} $, $ {b}_{i} \in \mathbb{R}^{k} $ and $ C \in \mathbb{R}^{m \times k} $.

One could compute the derivative and look for a closed form solution.

Yet an easier approach uses the following property of the Kronecker Product:

$$ \operatorname{Vec} \left( {A}_{i} x {b}_{i}^{T} \right) = \left( {b}_{i} \otimes {A}_{i} \right) x $$

Where $ \otimes $ is the Kronecker Product and $ \operatorname{Vec} \left( \cdot \right) $ is the Vectorization Operator.
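The identity can be verified numerically. Here is a minimal sketch in NumPy (the dimensions and random data are illustrative, not from the original); note that $ \operatorname{Vec} \left( \cdot \right) $ stacks columns, i.e. column-major order, which is `order="F"` in NumPy, and that $ {b}_{i} $ must be treated as a $ k \times 1 $ column vector so that $ {b}_{i} \otimes {A}_{i} $ has shape $ km \times n $:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 4, 3, 5  # illustrative sizes
A = rng.standard_normal((m, n))
b = rng.standard_normal(k)
x = rng.standard_normal(n)

# Left-hand side: Vec(A x b^T), with Vec stacking columns (column-major)
lhs = np.outer(A @ x, b).ravel(order="F")

# Right-hand side: (b ⊗ A) x, with b as a k x 1 column so the result is (k*m, n)
rhs = np.kron(b.reshape(-1, 1), A) @ x

print(np.allclose(lhs, rhs))  # the two sides agree
```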

So the above becomes:

$$\begin{aligned} \arg \min_{x \in \mathbb{R}^{n}} \frac{1}{2} {\left\| \left( \sum_{i} {A}_{i} x {b}_{i}^{T} \right) - C \right\|}_{F}^{2} & = \arg \min_{x \in \mathbb{R}^{n}} \frac{1}{2} {\left\| \left( \sum_{i} \left( {b}_{i} \otimes {A}_{i} \right) x \right) - \operatorname{Vec} \left( C \right) \right\|}_{2}^{2} \\ & = \arg \min_{x \in \mathbb{R}^{n}} \frac{1}{2} {\left\| \left( \sum_{i} \left( {b}_{i} \otimes {A}_{i} \right) \right) x - \operatorname{Vec} \left( C \right) \right\|}_{2}^{2} \\ & = \arg \min_{x \in \mathbb{R}^{n}} \frac{1}{2} {\left\| D x - \operatorname{Vec} \left( C \right) \right\|}_{2}^{2} \end{aligned}$$

So given your problem, all you need to do is precompute the matrix $ D = \sum_{i} \left( {b}_{i} \otimes {A}_{i} \right) $ and solve a regular Linear Least Squares problem.
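The full procedure can be sketched as follows (a NumPy sketch with illustrative sizes and random data; the number of terms `p` and the variable names are assumptions, not from the original). It builds $ D $, solves the least squares problem, and checks the first order optimality condition $ {D}^{T} \left( D x - \operatorname{Vec} \left( C \right) \right) = 0 $:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k, p = 4, 3, 5, 6  # p = number of (A_i, b_i) pairs (illustrative)
As = [rng.standard_normal((m, n)) for _ in range(p)]
bs = [rng.standard_normal(k) for _ in range(p)]
C = rng.standard_normal((m, k))

# D = sum_i (b_i ⊗ A_i), shape (k*m, n); b_i reshaped to a column vector
D = sum(np.kron(b.reshape(-1, 1), A) for A, b in zip(As, bs))

# Regular Linear Least Squares: x = argmin ||D x - Vec(C)||_2
c = C.ravel(order="F")  # Vec(C), column-major stacking
x, *_ = np.linalg.lstsq(D, c, rcond=None)

# First order optimality: the gradient D^T (D x - Vec(C)) should vanish
grad = D.T @ (D @ x - c)
print(np.allclose(grad, 0, atol=1e-8))
```

Since $ D $ is only $ km \times n $, forming it explicitly is cheap whenever $ n $ is moderate, and any standard least squares solver applies.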