How to find two small matrices $M_1$ and $M_2$ such that $M_1 M_2 A \approx M A$?


If we have a matrix $M$ and we want its least-squares (Frobenius-norm) approximation as the product $M_1 M_2$ of two smaller matrices (fewer columns in $M_1$ and fewer rows in $M_2$) of a given size, we can simply truncate the SVD, keeping the components with the largest singular values (the Eckart–Young theorem).
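A minimal sketch of that truncation in NumPy, with arbitrary example sizes and target rank (all names here are illustrative, not from the question):

```python
import numpy as np

# Approximate M by a rank-k product M1 @ M2 via truncated SVD.
rng = np.random.default_rng(0)
M = rng.standard_normal((8, 6))
k = 2

# Keep only the k largest singular values (Eckart-Young optimal in Frobenius norm).
U, s, Vt = np.linalg.svd(M, full_matrices=False)
M1 = U[:, :k] * s[:k]   # 8 x k
M2 = Vt[:k, :]          # k x 6

# The residual equals the energy in the discarded singular values.
err = np.linalg.norm(M - M1 @ M2)
```

The residual `err` equals $\sqrt{\sum_{i>k}\sigma_i^2}$, the sum running over the discarded singular values.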

It's not true, however, that we can do the same to approximate $MA$ as $M_1 M_2 A$. For instance, if the rows of $A$ have mean $0$ and $M$ is the identity, we need to run PCA on $A$ (rather than on $M$) to find $M_1$ and $M_2$.
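The counterexample can be checked numerically. In this hedged sketch (arbitrary sizes and seed), with $M = I$ a rank-$k$ truncation of $M$ just keeps $k$ coordinates and ignores $A$, while projecting onto the top-$k$ left singular vectors of $A$ is optimal:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 100))
k = 2

# "SVD of M" route: M = I, so a rank-k truncation merely keeps k coordinates.
M1_naive = np.eye(6)[:, :k]
M2_naive = np.eye(6)[:k, :]
naive_err = np.linalg.norm(A - M1_naive @ M2_naive @ A)

# "PCA/SVD of A" route: project onto the top-k left singular vectors of A.
U, s, _ = np.linalg.svd(A, full_matrices=False)
V = U[:, :k]
best_err = np.linalg.norm(A - V @ (V.T @ A))
```

Generically `best_err` is strictly smaller than `naive_err`, confirming that the SVD of $M$ alone cannot be the answer.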

So, short of running gradient descent, is there a simple way of finding the solution?

My current best guess is $(MV)(V^T)A$, where the columns of $V$ are the top principal components of $A$. But that is presumably suboptimal.
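To quantify the suboptimality of this guess, one can compare it against a lower bound: the best rank-$k$ approximation of the product $MA$ itself, obtained by truncating its SVD. A hedged sketch (uncentered PCA, i.e. top left singular vectors of $A$; sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 6))
A = rng.standard_normal((6, 200))
k = 2

# The guess from the question: M1 = M @ V, M2 = V.T, with V the top-k
# left singular vectors of A (uncentered PCA directions).
U, s, _ = np.linalg.svd(A, full_matrices=False)
V = U[:, :k]
guess_err = np.linalg.norm(M @ A - (M @ V) @ (V.T @ A))

# Lower bound: truncated SVD of the product M @ A directly.
Up, sp, Vtp = np.linalg.svd(M @ A, full_matrices=False)
opt = (Up[:, :k] * sp[:k]) @ Vtp[:k, :]
opt_err = np.linalg.norm(M @ A - opt)
```

Generically `guess_err` exceeds `opt_err`, which makes the gap to optimality concrete for a given $M$ and $A$.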

It would also be very nice if, interpreting the columns of $A$ as data samples, there were a natural way of applying regularization, so that with few samples or strong regularization the solution approaches the plain SVD of $M$.