Assume we're trying to find $A\in [-1,1]^{n\times d}, B\in [-1,1]^{m\times d}$, from an observed matrix $C\in [-d,d]^{n\times m}$, where $C=AB^T$.
The goal is to return $\widehat A, \widehat B$ such that $\widehat A \widehat B^T\approx C$.
If we had $n=m$, this would be a simple SVD decomposition (deleting the $n-d$ smallest singular values, if there's noise).
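To make the square case concrete, here is a minimal NumPy sketch (sizes and seed are made up) of recovering a rank-$d$ factorization via truncated SVD:

```python
import numpy as np

# Hypothetical sizes; A, B are the ground-truth factors we try to recover.
n, m, d = 6, 6, 2
rng = np.random.default_rng(0)
A = rng.uniform(-1, 1, (n, d))
B = rng.uniform(-1, 1, (m, d))
C = A @ B.T

# Truncated SVD: keep only the d largest singular values.
U, s, Vt = np.linalg.svd(C, full_matrices=False)
A_hat = U[:, :d] * s[:d]   # absorb singular values into the left factor
B_hat = Vt[:d].T

# The product matches C, even though (A_hat, B_hat) differ from (A, B)
# by an invertible d x d transformation.
assert np.allclose(A_hat @ B_hat.T, C)
```

Note that $\widehat A$ obtained this way need not have entries in $[-1,1]$; only the product is matched.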
How can we address the case where the dimensions aren't the same?
Is it possible to efficiently find $\widehat A, \widehat B$ such that $\widehat A \widehat B^T= C$ (assuming such $A,B$ exist)?
If not, can we find $$\arg\min_{\widehat A, \widehat B}\|C-\widehat A \widehat B^T\|_F?$$
This seems doable by writing out all $mn$ constraints and running least squares, but I'm wondering if it has a nice closed form.
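For what it's worth, the least-squares idea can be sketched as alternating least squares (setup and iteration count are arbitrary): fixing one factor makes the objective linear in the other, so each half-step is an ordinary least-squares solve.

```python
import numpy as np

# Assumed setup: exact rank-d data, random initialization.
rng = np.random.default_rng(1)
n, m, d = 8, 5, 3
C = rng.uniform(-1, 1, (n, d)) @ rng.uniform(-1, 1, (m, d)).T

A_hat = rng.standard_normal((n, d))
B_hat = rng.standard_normal((m, d))
for _ in range(5):
    # With B_hat fixed, each row of A_hat solves a linear least-squares problem:
    # min ||B_hat @ X - C.T||_F, then A_hat = X.T.
    A_hat = np.linalg.lstsq(B_hat, C.T, rcond=None)[0].T
    # Symmetrically for B_hat with A_hat fixed.
    B_hat = np.linalg.lstsq(A_hat, C, rcond=None)[0].T

# On noiseless rank-d data this fits C essentially exactly.
assert np.linalg.norm(C - A_hat @ B_hat.T) < 1e-6
```

This only illustrates the iterative route; it doesn't settle whether a closed form exists.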
Are $\widehat A, \widehat B$ unique (up to rotation/scaling/permutation)?