Tensor decomposition of a rank-3 tensor used for low-rank construction of a 'similar' tensor


I am a computational physicist trying to construct a rank-3 tensor $\mathbf{g} \in \mathbb{C}^{n_m\times n_n\times n_r}$, whose element $\mathbf{g}_{mnr}$ represents the scattering amplitude from a state $m$ to a state $n$ under the process $\hat S_r$, or in quantum-mechanical terms, $\mathbf{g}_{mnr} = \langle \phi_n|\hat S_r| \phi_m \rangle$. Computing this tensor is expensive because each element of $\mathbf{g}_{mnr}$ is computed individually via a high-dimensional integral.

However, many elements of this tensor contain redundant information, many are small, and so on. I could save a lot of computational effort by computing elements in a different, truncated basis (since the $\hat S_r$'s are linear operators and the sets of states $\{\phi_n\}$ and $\{\phi_m\}$ are orthonormal bases), something like $\mathbf{g}_{ijk} = \langle \sum\limits_n A_n^{i} \phi_n|\sum\limits_r B_r^{k} \hat S_r|\sum\limits_m C_m^{j} \phi_m \rangle$, for sets of complex coefficients $A^i$, $B^k$, and $C^j$. If this basis is well chosen, a low-rank approximation (keeping only a small number of indices $i$, $j$, $k$) should suffice to capture the important information, and I could easily change basis back to $\mathbf{g}_{mnr}$.
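For concreteness, since the bra picks up a complex conjugate, the truncated-basis elements are a multilinear contraction of the original tensor. A minimal numpy sketch (the dimensions and the pairing of $i\leftrightarrow n$, $j\leftrightarrow m$, $k\leftrightarrow r$ are my illustrative assumptions):

```python
import numpy as np

# Hypothetical dimensions: full basis sizes and truncated sizes.
nm, nn, nr = 6, 5, 4   # full basis sizes
ni, nj, nk = 3, 3, 2   # truncated basis sizes

rng = np.random.default_rng(0)
g = rng.standard_normal((nm, nn, nr)) + 1j * rng.standard_normal((nm, nn, nr))

# Coefficient matrices: A for the bra (over n), C for the ket (over m),
# B for the operators (over r).
A = rng.standard_normal((ni, nn)) + 1j * rng.standard_normal((ni, nn))
C = rng.standard_normal((nj, nm)) + 1j * rng.standard_normal((nj, nm))
B = rng.standard_normal((nk, nr)) + 1j * rng.standard_normal((nk, nr))

# g_ijk = sum_{m,n,r} conj(A[i,n]) * C[j,m] * B[k,r] * g[m,n,r]
# (the bra coefficients are conjugated by the inner product)
g_trunc = np.einsum('in,jm,kr,mnr->ijk', A.conj(), C, B, g)
print(g_trunc.shape)  # (3, 3, 2)
```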

I could obtain reasonable approximations of $A^i$, $B^k$, and $C^j$ from the tensor decomposition of a tensor $\mathbf{h}_{mnr}$, which has a structure very similar to $\mathbf{g}_{mnr}$ but is cheap to compute, and then compute $\mathbf{g}_{ijk}$ without any additional calculations in the $mnr$ basis. (In the best case, $\mathbf{g}_{mnr}$ is a scalar multiple of $\mathbf{h}_{mnr}$; generally it is close to a scalar multiple, with some weaker dependence on $m$ and $n$.)

I had hoped that by calculating the Tucker decomposition of $\mathbf{h}_{mnr}$, $[\mathbf{H};\mathbf{A},\mathbf{B},\mathbf{C}]$, I could use the factors to compute $\mathbf{g}_{mnr} \approx [\mathbf{G};\mathbf{A}^{\mathrm{trunc}},\mathbf{B}^{\mathrm{trunc}},\mathbf{C}^{\mathrm{trunc}}]$, where element $ijk$ of $\mathbf{G}$ equals $\mathbf{g}_{ijk}$, which I compute only if the corresponding $|\mathbf{h}_{ijk}|$ exceeds some threshold. However, when I compute all elements of $\mathbf{G}$, the Tucker reconstruction is close to, but not exactly equal to, $\mathbf{g}_{mnr}$. I expected the reconstruction to be exact when the threshold is zero, because the factors $\mathbf{A},\mathbf{B},\mathbf{C}$ are orthonormal and full-rank and they correspond to linear combinations of an orthonormal basis, but I have not been able to prove whether this should or shouldn't be the case.
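For reference, here is the sanity check I can run numerically: when the factor matrices are square and unitary (full-rank orthonormal), projecting *any* tensor onto the core basis and mapping back is exact, since $\mathbf{U}\mathbf{U}^{H} = \mathbf{I}$ on each mode. A numpy sketch using HOSVD factors of a cheap tensor `h` applied to a different tensor `g` (all names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (5, 4, 3)
h = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
# g is close to, but not exactly, a scalar multiple of h
g = 2.0 * h + 0.05 * (rng.standard_normal(shape) + 1j * rng.standard_normal(shape))

def unfold(t, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the rest."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mode_dot(t, M, mode):
    """Multiply tensor t by matrix M along `mode`."""
    moved = np.moveaxis(t, mode, 0)
    return np.moveaxis(np.tensordot(M, moved, axes=(1, 0)), 0, mode)

# HOSVD: factors are the left singular vectors of each unfolding of h.
# At full rank each factor is a square unitary matrix.
factors = [np.linalg.svd(unfold(h, mode), full_matrices=False)[0]
           for mode in range(3)]

# Project g into the core basis built from h's factors, then reconstruct.
core = g
for mode, U in enumerate(factors):
    core = mode_dot(core, U.conj().T, mode)
recon = core
for mode, U in enumerate(factors):
    recon = mode_dot(recon, U, mode)

print(np.max(np.abs(recon - g)))  # machine-precision zero: exact at full rank
```

In other words, if the core were obtained by *projection* ($\mathbf{G} = \mathbf{g} \times_1 \mathbf{A}^{H} \times_2 \mathbf{B}^{H} \times_3 \mathbf{C}^{H}$), the full-rank reconstruction would be exact; any discrepancy would then have to come from how the core elements are actually computed.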

In short, my questions are:

1. By performing the Tucker decomposition of a tensor $\mathbf{X} \in \mathbb{C}^{n_m \times n_n \times n_r}$, can one vary the elements of the core tensor (also in $\mathbb{C}^{n_m \times n_n \times n_r}$) to span the space of all other tensors in $\mathbb{C}^{n_m \times n_n \times n_r}$, either under the conditions I've laid out above or otherwise?
2. Are there other ways to come up with a "good" set of elements $\mathbf{g}_{ijk}$ from which I can reconstruct $\mathbf{g}$ using a "similar" tensor $\mathbf{h}$, especially one that reconstructs $\mathbf{g}$ exactly in some limit? Possibly using another tensor decomposition method (not Tucker), or other tricks? I would appreciate even a link to a paper by any author who has considered a similar problem.