I was wondering how you solve (or approximate) for $B$ in $A=BC$ when $A$ and $C$ are known and $A$, $B$, $C$ are not square?
It seems like you could approximate this by solving something like:
$B^* = \mathrm{argmin}_B \sum_{i,j} \left((\sum_kB_{i,k}C_{k,j}) - A_{i,j} \right)^2$
Is there a better way to solve this problem?
(By the way, I am trying to solve a collaborative filtering problem, where $A$ is a $\mathrm{user} \times \mathrm{product}$ matrix and $C$ is a $\mathrm{feature}\times\mathrm{product}$ matrix. I am trying to find the $\mathrm{user}\times\mathrm{feature}$ matrix. I would eventually like to split $B$ into two matrices, $B=B_1B_2$, a user-preference matrix and a preference-feature matrix, but I am trying to solve this simpler problem first.)
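For reference, the objective above is an ordinary least-squares problem in each row of $B$, so it can be solved directly without hand-rolled optimization. A minimal numpy sketch (the shapes and random data here are made-up placeholders, not from an actual recommender dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 8))   # user x product (5 users, 8 products)
C = rng.standard_normal((3, 8))   # feature x product (3 features)

# Minimize ||B C - A||_F^2. Transposing gives the standard
# least-squares form C^T B^T ~= A^T, which lstsq handles directly.
Bt, *_ = np.linalg.lstsq(C.T, A.T, rcond=None)
B = Bt.T                           # user x feature, here 5 x 3
print(np.linalg.norm(B @ C - A))   # residual of the fit
```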
Investigating further, it seems like you have to solve this by optimization. I found two methods:
Method 1. Taking the derivative with respect to each $b_{i,j} \in B$.
If you add a regularization term for $b_i$ (row $i$ of $B$), then the iterative (stochastic gradient descent) update looks like:
$$ b_i \leftarrow b_i + \gamma \left((a_{i,j} - b_i^Tc_j)c_j - \lambda b_i\right)$$
where $c_j$ is column $j$ of $C$. (Without regularization, the $-\lambda b_i$ term would be removed from the equation.)
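The Method 1 update can be sketched as follows, assuming hypothetical small dimensions and a synthetic $A = B_{\mathrm{true}}C$ so convergence is easy to check (step size and epoch count are arbitrary choices, not tuned values):

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_feat, n_prod = 4, 3, 6
C = rng.standard_normal((n_feat, n_prod))      # feature x product
B_true = rng.standard_normal((n_users, n_feat))
A = B_true @ C                                  # user x product

B = np.zeros((n_users, n_feat))
gamma, lam = 0.05, 0.0   # step size; set lam > 0 for the -lambda*b_i shrinkage
for _ in range(500):
    for i in range(n_users):
        for j in range(n_prod):
            err = A[i, j] - B[i] @ C[:, j]      # a_ij - b_i^T c_j
            B[i] += gamma * (err * C[:, j] - lam * B[i])
print(np.linalg.norm(B @ C - A))
```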
Method 2. I also found a closed-form matrix solution (using only one of the two updates of alternating least squares, since $C$ is known):
$$ b_i = (CC^T + \lambda I)^{-1}Ca_i $$
where $a_i$ is row $i$ of $A$ written as a column vector.
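Stacking the per-row solutions gives $B = AC^T(CC^T + \lambda I)^{-1}$, which can be computed in one shot. A sketch with placeholder shapes and an arbitrary $\lambda$ (using `solve` rather than an explicit inverse, and exploiting the symmetry of $CC^T + \lambda I$):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 8))   # user x product
C = rng.standard_normal((3, 8))   # feature x product
lam = 0.1

# B = A C^T (C C^T + lam I)^{-1}; since M = C C^T + lam I is symmetric,
# B^T = M^{-1} C A^T, solved as a linear system instead of inverting M.
M = C @ C.T + lam * np.eye(C.shape[0])
B = np.linalg.solve(M, C @ A.T).T   # user x feature, here 5 x 3
print(B.shape)
```

As $\lambda \to 0$ this recovers the unregularized least-squares fit.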