Can a vector be decomposed (non-uniquely) into tensor products of smaller vectors?


I seek an extension of the solution found in this question, which refers to the "Nearest Kronecker Product".

Given $A\in \mathbb R^{m\times n} $ with $m = m_1m_2m_3$ and $n = n_1n_2n_3$, find $B\in \mathbb R^{m_1\times n_1}$ and $C\in \mathbb R^{m_2\times n_2}$ and $D\in \mathbb R^{m_3\times n_3}$ so that

$$\phi(B,C,D) = \| A- B\otimes C \otimes D\|_F$$ is minimized, where $\|\cdot\|_F$ denotes the Frobenius norm.

Less precisely, I seek a way to decompose $A$ into $B\otimes C \otimes D$. The solution need not be unique. For simplicity, it can be assumed $m_1=m_2=m_3$ and that $n_1=n_2=n_3$.

Further, although the formulation above is general, I really only need a solution for the case $n_1=n_2=n_3=1$, i.e. where $A$, $B$, $C$, and $D$ are column vectors.
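To make the vector case concrete, here is a minimal numpy sketch (sizes are arbitrary illustrative choices): when $n_1=n_2=n_3=1$, the Kronecker product $B\otimes C\otimes D$ is just `np.kron` applied twice to three vectors, producing a vector of length $m_1 m_2 m_3$.

```python
import numpy as np

# Illustrative sizes: m1 = m2 = m3 = 2, n1 = n2 = n3 = 1,
# so B, C, D are vectors and A has length m1*m2*m3 = 8.
rng = np.random.default_rng(0)
b = rng.standard_normal(2)
c = rng.standard_normal(2)
d = rng.standard_normal(2)

# For vectors, B (x) C (x) D is np.kron applied twice.
a = np.kron(b, np.kron(c, d))
print(a.shape)  # (8,)
```

Any such `a` is exactly decomposable by construction; the question is how to recover factors (or the nearest decomposable vector) from an arbitrary $A$.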



You can reformulate your problem as a tensor rank-one approximation problem: if you reshape the given $A$ into a third-order tensor of size $m_1 n_1 \times m_2 n_2 \times m_3 n_3$, then your problem is equivalent to finding vectors $b \in \mathbb{R}^{m_1 n_1 }$, $c \in \mathbb{R}^{m_2 n_2 }$, and $d \in \mathbb{R}^{m_3 n_3}$ such that $\| A - b \circ c \circ d\|$ is minimized, where $\circ$ denotes the outer product.
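The reshape step can be checked directly in the vector case ($n_1=n_2=n_3=1$): in row-major order, entry $i\,m_2 m_3 + j\,m_3 + k$ of $b\otimes c\otimes d$ equals $b_i c_j d_k$, which is exactly the $(i,j,k)$ entry of the outer product $b\circ c\circ d$. A small numpy sketch (sizes chosen arbitrarily for illustration):

```python
import numpy as np

# Verify that reshaping a Kronecker product of vectors gives an outer product.
m1, m2, m3 = 2, 3, 4
rng = np.random.default_rng(1)
b = rng.standard_normal(m1)
c = rng.standard_normal(m2)
d = rng.standard_normal(m3)

# kron(b, kron(c, d))[i*m2*m3 + j*m3 + k] = b[i] * c[j] * d[k],
# so a row-major reshape to (m1, m2, m3) yields the outer product b o c o d.
a = np.kron(b, np.kron(c, d))
T = a.reshape(m1, m2, m3)
outer = np.einsum('i,j,k->ijk', b, c, d)
print(np.allclose(T, outer))  # True
```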

This is a three-way rank-one approximation problem. Unfortunately, there is no direct extension of the Eckart-Young theorem to tensors, so a truncated (three-way) SVD will not in general yield the optimal solution (it can, however, be a good initial guess, depending on $A$). The optimal solution can be found iteratively, e.g., via the higher-order orthogonal iteration (HOOI) algorithm [1].
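For rank one, HOOI reduces to alternating updates of the three factor vectors (sometimes called higher-order power iteration). A minimal numpy sketch, with the function name `rank1_approx` and the iteration count being illustrative choices, initialized from the truncated-SVD guess mentioned above:

```python
import numpy as np

def rank1_approx(T, iters=100):
    """Rank-1 approximation of a 3-way tensor by alternating updates
    (the rank-1 case of HOOI / higher-order power iteration)."""
    m1, m2, m3 = T.shape
    # Initialize b, c from leading left singular vectors of two unfoldings.
    b = np.linalg.svd(T.reshape(m1, -1), full_matrices=False)[0][:, 0]
    c = np.linalg.svd(np.moveaxis(T, 1, 0).reshape(m2, -1),
                      full_matrices=False)[0][:, 0]
    for _ in range(iters):
        # Each step contracts T with two unit factors and renormalizes.
        d = np.einsum('ijk,i,j->k', T, b, c)
        d /= np.linalg.norm(d)
        b = np.einsum('ijk,j,k->i', T, c, d)
        b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,k->j', T, b, d)
        c /= np.linalg.norm(c)
    # Scale factor of the rank-1 term sigma * (b o c o d).
    sigma = np.einsum('ijk,i,j,k->', T, b, c, d)
    return sigma, b, c, d
```

On an exactly rank-one tensor this recovers the factors up to sign and scaling; for general $T$ each update is the least-squares optimum for one factor with the other two fixed, so the residual is non-increasing.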

We've referred to this as (least-squares) Kronecker product factorization in the past, to give it another searchable name. The approach extends to an arbitrary number of dimensions.

[1] De Lathauwer, Lieven; De Moor, Bart; Vandewalle, Joos, On the best rank-1 and rank-$(R_1,R_2,\ldots,R_N)$ approximation of higher-order tensors, SIAM J. Matrix Anal. Appl. 21, No. 4, 1324-1342 (2000). ZBL0958.15026.