Matrix Minimization


I'd like to minimize the following function, but I can't reach a closed-form solution for $C$ from the first-order partial derivative.

$\frac{1}{2}||A-BCD^T||_F^2 + \frac{1}{2}||C||_F^2$

where $A$, $B$, $C$, and $D$ are matrices of compatible dimensions.

Setting the partial derivative with respect to $C$ to zero, I get:

$-B^TAD + B^TBCD^TD + C = 0$

But the second term is giving me trouble. Any help would be appreciated!
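As a sanity check on the first-order condition above, one can compare the analytic gradient against a finite-difference approximation. The sketch below uses hypothetical random matrices of small, arbitrarily chosen dimensions; the derivative shown in the question corresponds to a $\frac{1}{2}$ factor on the data-fit term.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical small instance; dimensions chosen arbitrarily for the check.
p, q, r, s = 5, 4, 3, 2
A = rng.standard_normal((p, q))
B = rng.standard_normal((p, r))
C = rng.standard_normal((r, s))
D = rng.standard_normal((q, s))

def f(C):
    # Objective with the 1/2 convention matching the stated derivative.
    return 0.5 * np.linalg.norm(A - B @ C @ D.T, 'fro')**2 \
         + 0.5 * np.linalg.norm(C, 'fro')**2

# Analytic gradient from the post: -B^T A D + B^T B C D^T D + C.
G = -B.T @ A @ D + B.T @ B @ C @ (D.T @ D) + C

# Central finite-difference check of one entry of the gradient.
eps = 1e-6
E = np.zeros_like(C)
E[0, 0] = eps
fd = (f(C + E) - f(C - E)) / (2 * eps)
print(abs(fd - G[0, 0]))  # close to zero
```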

2 Answers

BEST ANSWER

Hint: To solve the following matrix equation for $X$,

$$P + QXR + X = 0,$$

use the vectorization property: $$\text{vec}(QXR) = (R^T \otimes Q)\text{vec}(X).$$

Thus, it becomes a linear system:

$$ \left((R^T \otimes Q) + I\right)\text{vec}(X) = -\text{vec}(P).$$

Can you proceed from here?
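The hint above can be sketched numerically. This is a minimal illustration, assuming generic random test matrices $Q$, $R$, $P$ (names follow the hint); note that $\text{vec}$ is column-major stacking, so NumPy calls use `order='F'`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small test matrices; X will be n-by-m.
n, m = 3, 4
Q = rng.standard_normal((n, n))
R = rng.standard_normal((m, m))
P = rng.standard_normal((n, m))

# vec(QXR) = (R^T kron Q) vec(X), with vec = column-major stacking.
K = np.kron(R.T, Q) + np.eye(n * m)
vec_X = np.linalg.solve(K, -P.flatten(order='F'))
X = vec_X.reshape((n, m), order='F')

# Check that X satisfies P + Q X R + X = 0 (up to round-off).
residual = np.linalg.norm(P + Q @ X @ R + X)
print(residual)
```

For generic matrices the system matrix $(R^T \otimes Q) + I$ is invertible, so `np.linalg.solve` succeeds; in the original problem it is always invertible because the Kronecker factors are positive semidefinite.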

ANSWER

Formally the solution is given by $\operatorname{vec}(C)=\left[(D^TD)\otimes(B^TB)+I\right]^{-1}\operatorname{vec}(B^TAD)$. If you want to avoid the Kronecker (tensor) product, you may solve for $C$ as follows.

  1. Let $B^TB=USU^T$ and $D^TD=V\Lambda V^T$ be two orthogonal diagonalisations.
  2. Let $X=U^TCV$ and $M=U^TB^TADV$. Then the equation in question can be rewritten as $SX\Lambda+X=M$.
  3. Hence you may solve for $X$ entrywise: $x_{ij}=\dfrac{m_{ij}}{s_i\lambda_j+1}$. Note that, as $B^TB$ and $D^TD$ are positive semidefinite, the diagonal entries of $S$ and $\Lambda$ are nonnegative and hence the denominator $s_i\lambda_j+1$ is always positive.
  4. Now $C=UXV^T$.
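The four steps above can be sketched as follows. This is a minimal illustration with hypothetical random matrices of arbitrarily chosen dimensions; the final check verifies the normal equation $B^TBCD^TD + C = B^TAD$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dimensions: A is p-by-q, B is p-by-r, C is r-by-s, D is q-by-s.
p, q, r, s = 6, 5, 3, 4
A = rng.standard_normal((p, q))
B = rng.standard_normal((p, r))
D = rng.standard_normal((q, s))

# Step 1: orthogonal diagonalisations of the symmetric Gram matrices.
S_vals, U = np.linalg.eigh(B.T @ B)   # B^T B = U S U^T
L_vals, V = np.linalg.eigh(D.T @ D)   # D^T D = V Lambda V^T

# Step 2: rotate the right-hand side: M = U^T B^T A D V.
M = U.T @ B.T @ A @ D @ V

# Step 3: entrywise solve S X Lambda + X = M, i.e.
# x_ij = m_ij / (s_i * lambda_j + 1); denominators are positive since
# the eigenvalues of the Gram matrices are nonnegative.
X = M / (np.outer(S_vals, L_vals) + 1.0)

# Step 4: rotate back.
C = U @ X @ V.T

# Verify B^T B C D^T D + C = B^T A D.
residual = np.linalg.norm(B.T @ B @ C @ (D.T @ D) + C - B.T @ A @ D)
print(residual)
```

Compared with the Kronecker approach, this costs two small eigendecompositions plus matrix products instead of solving an $rs \times rs$ linear system.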