How to decompose a matrix as the sum of Kronecker products?


I have encountered a problem involving the Kronecker product (KP).

I want to decompose $A=\sum^r_{i=1}B_i\otimes C_i$, where $A\in \mathbb{R}^{8\times2}, B_i\in \mathbb{R}^{4\times2}, C_i\in \mathbb{R}^{2\times1}.$

I note that some matrices can be decomposed as a single KP $(r=1)$, for example:

$$A_1=\begin{bmatrix}1& 0\\1& 0\\0& 1\\0& 1\\1& 0\\1& 0\\0& 1\\0& 1\end{bmatrix}=\begin{bmatrix}1& 0\\0& 1\\1& 0\\0& 1\end{bmatrix}\otimes\begin{bmatrix}1\\1\end{bmatrix}.$$

However, there are also many matrices that need a sum of more than one KP $(r>1)$, for example:

$$A_2=\begin{bmatrix}1& 0\\0& 0\\0& 1\\0& 0\\1& 0\\1& 0\\0& 1\\0& 1\end{bmatrix}=\begin{bmatrix}1& 0\\0& 1\\0& 0\\0& 0\end{bmatrix}\otimes\begin{bmatrix}1\\0\end{bmatrix}+\begin{bmatrix}0& 0\\0& 0\\1& 0\\0& 1\end{bmatrix}\otimes\begin{bmatrix}1\\1\end{bmatrix}.$$

I know the above examples could also be written as sums with more terms, but my question is how to decompose a matrix as a sum of KPs with the least $r$.

Remark: $A$ and the sizes of $B_i$ and $C_i$ are known; $B_i$, $C_i$, and $r$ are unknown and to be solved for.

There are 2 answers below.

Answer 1

For a fixed matrix $C$, a Kronecker product of the form $B\otimes C$ is just any block matrix where each block is a scalar multiple of $C$ (the scalars being the entries of $B$). So, to write a matrix $A$ as a sum of Kronecker products $\sum_{i=1}^r B_i\otimes C_i$, the $C_i$ must be a collection of matrices that span all the $C$-shaped blocks in $A$, and then the $B_i$ are just the coefficients you use to write each $C$-shaped block as a linear combination of the $C_i$. In your case, $C$ is $2\times 1$ so the space of all possible $C$-shaped blocks is only $2$-dimensional, so you always have $r\leq 2$.
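This observation can be turned directly into a computation: the minimal $r$ equals the dimension of the span of the $2\times 1$ blocks of $A$, i.e. the rank of the matrix obtained by stacking the vectorized blocks as rows. A minimal NumPy sketch (the helper name `min_kron_rank` is my own; `A1` and `A2` are the examples from the question):

```python
import numpy as np

def min_kron_rank(A, block_shape=(2, 1)):
    """Minimal r in A = sum_i B_i kron C_i, for C_i of the given block shape.

    Each (p x q) block of A, vectorized, becomes one row of R; the minimal r
    is the dimension of the span of those blocks, i.e. rank(R).
    """
    p, q = block_shape
    m, n = A.shape
    rows = []
    for i in range(0, m, p):
        for j in range(0, n, q):
            rows.append(A[i:i+p, j:j+q].ravel())
    R = np.array(rows)
    return np.linalg.matrix_rank(R)

A1 = np.array([[1,0],[1,0],[0,1],[0,1],[1,0],[1,0],[0,1],[0,1]], dtype=float)
A2 = np.array([[1,0],[0,0],[0,1],[0,0],[1,0],[1,0],[0,1],[0,1]], dtype=float)
print(min_kron_rank(A1))  # 1
print(min_kron_rank(A2))  # 2
```

This confirms the two examples above: all $2\times 1$ blocks of $A_1$ are multiples of a single vector, while those of $A_2$ span a 2-dimensional space.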

Explicitly, using $r=2$, you can always just take $C_1=\begin{bmatrix}1 \\ 0\end{bmatrix}$ and $C_2=\begin{bmatrix} 0 \\ 1\end{bmatrix}$. Then $B_1$ and $B_2$ will just be the submatrices of $A$ formed by its odd and even rows, respectively. If one of $B_1$ and $B_2$ happens to be a scalar multiple of the other (say $B_2=aB_1$), then you can simplify $A=B_1\otimes C_1+B_2\otimes C_2$ down to just $B_1\otimes (C_1+aC_2)$ and get a solution with $r=1$. (In terms of the description in the previous paragraph, this is the case where every $2\times 1$ block in $A$ is a scalar multiple of $C_1+aC_2=\begin{bmatrix}1\\a\end{bmatrix}$, so there is a single matrix that spans all the $2\times 1$ blocks of $A$.)
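The explicit $r=2$ construction above can be sketched as follows, using the question's matrix $A_2$ (a minimal illustration, not a general library routine):

```python
import numpy as np

# The question's matrix A_2.
A = np.array([[1,0],[0,0],[0,1],[0,0],[1,0],[1,0],[0,1],[0,1]], dtype=float)

# C_1 and C_2 are the standard basis vectors of R^2 (as 2x1 columns).
C1 = np.array([[1.0], [0.0]])
C2 = np.array([[0.0], [1.0]])

# B_1, B_2 are the submatrices of A formed by its odd and even rows
# (1-indexed), i.e. rows 0, 2, 4, 6 and rows 1, 3, 5, 7 -- each 4x2.
B1 = A[0::2, :]
B2 = A[1::2, :]

# kron(B1, C1) places the rows of B1 at the odd positions and zeros at the
# even ones; kron(B2, C2) does the opposite, so the sum reconstructs A.
reconstructed = np.kron(B1, C1) + np.kron(B2, C2)
assert np.array_equal(reconstructed, A)
```

If `B2` happened to be a scalar multiple `a * B1`, the same code with `C1` replaced by `C1 + a * C2` would give the $r=1$ solution described above.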

Answer 2

In fact, a decomposition with at most $2$ terms is always possible via the SVD as well, though with factor sizes different from the ones you prescribed: $B_i$ will be $8\times 1$ and $C_i$ will be $1\times 2$.

Indeed, the singular value decomposition (SVD):

$$A=U \Sigma V^T$$

where:

$$ U:=(U_1|U_2|...|U_8) \ \text{is} \ 8 \times 8$$

$$V:=(V_1|V_2) \ \text{is} \ 2 \times 2$$

$$\Sigma \ \text{is the} \ 8 \times 2 \ \text{matrix with} \ \Sigma_{11}=\sigma_1, \ \Sigma_{22}=\sigma_2, \ \text{and zeros elsewhere}$$

can be written under the following form:

$$A=\underbrace{\sigma_1 U_1}_{B_1}\underbrace{V_1^T}_{C_1}+\underbrace{\sigma_2 U_2}_{B_2}\underbrace{V_2^T}_{C_2}$$

$$A=B_1 \otimes C_1 +B_2 \otimes C_2,$$

since for a column vector $B$ and a row vector $C$, the matrix product $BC$ equals the Kronecker product $B \otimes C$.
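A minimal NumPy sketch of this SVD-based construction, again using the question's matrix $A_2$ (note the factor sizes: $B_i$ is $8\times 1$ and $C_i$ is $1\times 2$, not the $4\times 2$ and $2\times 1$ shapes the question asks for):

```python
import numpy as np

# The question's matrix A_2.
A = np.array([[1,0],[0,0],[0,1],[0,0],[1,0],[1,0],[0,1],[0,1]], dtype=float)

# Full SVD: U is 8x8, s holds the 2 singular values, Vt is 2x2.
U, s, Vt = np.linalg.svd(A)

B1 = s[0] * U[:, [0]]   # 8x1 column
C1 = Vt[[0], :]         # 1x2 row
B2 = s[1] * U[:, [1]]
C2 = Vt[[1], :]

# For a column vector B and a row vector C, B @ C == np.kron(B, C),
# so the rank-2 SVD expansion is also a sum of two Kronecker products.
reconstructed = np.kron(B1, C1) + np.kron(B2, C2)
assert np.allclose(reconstructed, A)
```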