Decomposition of order 3 tensor symmetric along two dimensions


I have a 3rd order tensor $\mathbf{A}$ consisting of symmetric covariance matrices (with dimensions of space by space) stacked in time. I would like to compute the leading spatial features that dominate the evolution of these covariance matrices.

My procedure so far has been to 1) string out the covariances into vectors to construct a 2-D nonsquare matrix, and compute its singular vectors, which are the (vectorized) covariance patterns that dominate the evolution, and then 2) reshape the leading singular vectors back into matrices and compute their eigenvectors.
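A minimal numpy sketch of this two-step procedure, on synthetic data (the dimensions `n`, `T` and the random covariances are illustrative, not from my actual problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 5, 40  # spatial dimension, number of time steps

# Synthetic stack of symmetric covariance matrices, shape (n, n, T)
A = np.empty((n, n, T))
for t in range(T):
    X = rng.standard_normal((n, 20))
    A[:, :, t] = X @ X.T / 20

# Step 1: string each covariance out into a column -> (n*n, T) matrix,
# then take its SVD; the left singular vectors are the vectorized
# covariance patterns that dominate the evolution
M = A.reshape(n * n, T)
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Step 2: reshape the leading singular vector back into an n x n matrix
# (symmetric up to round-off, since every column of M is a vectorized
# symmetric matrix) and compute its eigenvectors
C1 = U[:, 0].reshape(n, n)
C1 = (C1 + C1.T) / 2  # symmetrize against numerical noise
evals, evecs = np.linalg.eigh(C1)
```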

It seems that doing two eigenvector decompositions this way is suboptimal.

Does a procedure exist to find eigenvectors $\mathbf{u}_i$ such that I can write my 3rd order tensor as a sum of outer products \begin{align} \mathbf{A}=\sum_{i=1}^K \lambda_i(\mathbf{u}_i \otimes \mathbf{u}_i \otimes \mathbf{v}_i), \end{align} where $\mathbf{v}_i$ is a "principal component", $K$ is a tensor rank, and $\lambda_i$ is an eigenvalue-like weight?
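To make the target form concrete, here is a toy tensor built with exactly this partially symmetric rank-$K$ structure via `np.einsum` (the sizes and the random factors `lam`, `U`, `V` are illustrative placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, K = 5, 40, 3  # space, time, assumed tensor rank

lam = rng.standard_normal(K)     # eigenvalue-like weights lambda_i
U = rng.standard_normal((n, K))  # spatial vectors u_i (columns)
V = rng.standard_normal((T, K))  # temporal "principal components" v_i

# A[a, b, t] = sum_i lam_i * u_i[a] * u_i[b] * v_i[t], shape (n, n, T)
A = np.einsum('i,ai,bi,ti->abt', lam, U, U, V)
```

Because each $\mathbf{u}_i$ appears twice in the outer product, every frontal slice `A[:, :, t]` is symmetric by construction, matching the symmetry of the covariance stack.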