Suppose I have a matrix $T \in \mathbb{R}^{n \times n}$ of the form:
$$T = \sum_{i = 1}^{n_A} \sum_{j = 1}^{n_B} \sigma_{ij} a_i b_j^T$$
where $1 \leq n_A, n_B \leq n$, each $\sigma_{ij} \in \mathbb{R}\setminus\{0\}$, and $\{a_i\}_{i = 1}^{n_A}$ and $\{b_j\}_{j = 1}^{n_B}$ are columns of orthogonal matrices $A, B \in \mathbb{R}^{n \times n}$, respectively.
Suppose additionally that I know all of the $a_i$. Then is there a way of finding what the $b_j$ could be?
We can rewrite this sum in matrix form as $T = A\Sigma B^\top$, where $A$ is the matrix whose columns are the $a_i$, $B$ is the matrix whose columns are the $b_j$, and $\Sigma \in \mathbb{R}^{n \times n}$ holds the entries $\sigma_{ij}$ in its top-left $n_A \times n_B$ block (and zeros elsewhere when $n_A, n_B < n$). Since $A$ is orthogonal, multiplying on the left by $A^\top$ moves the known matrices to one side: $$ \Sigma B^\top = A^\top T. $$ In other words, the problem amounts to writing the known matrix $A^\top T$ in the form $\Sigma B^\top$, where $B$ is orthogonal.
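As a quick sanity check of this rewriting, here is a small NumPy sketch (taking $n_A = n_B = n$ for simplicity, so $\Sigma$ is a full $n \times n$ matrix; the variable names mirror the symbols above):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random orthogonal A and B via QR decompositions of Gaussian matrices.
A, _ = np.linalg.qr(rng.standard_normal((n, n)))
B, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Generic Sigma; its entries play the role of the sigma_ij.
Sigma = rng.standard_normal((n, n))

# T = sum_ij sigma_ij a_i b_j^T, written in matrix form.
T = A @ Sigma @ B.T

# Multiplying on the left by A^T recovers Sigma B^T.
assert np.allclose(A.T @ T, Sigma @ B.T)
```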
I am not sure how one would solve this problem systematically, but note that the decomposition is far from unique: for any orthogonal $B$, the matrix $\Sigma = A^\top T B$ satisfies the equation for the given $A, B, T$, and for a random choice of orthogonal $B$ this $\Sigma$ is very likely to have all entries non-zero.
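This non-uniqueness can be illustrated numerically (a sketch, again with $n_A = n_B = n$; `T` here is just an arbitrary test matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Known orthogonal A and an arbitrary target matrix T.
A, _ = np.linalg.qr(rng.standard_normal((n, n)))
T = rng.standard_normal((n, n))

# Pick a fresh random orthogonal B, unrelated to T...
B, _ = np.linalg.qr(rng.standard_normal((n, n)))

# ...and the induced Sigma makes the factorization hold exactly.
Sigma = A.T @ T @ B
assert np.allclose(A @ Sigma @ B.T, T)

# Generically, such a Sigma has no exactly-zero entries.
assert np.min(np.abs(Sigma)) > 0
```

Since $B$ was drawn independently of $T$, this shows that essentially any orthogonal $B$ admits a compatible $\Sigma$, so knowing $A$ and $T$ alone does not pin down the $b_j$.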