Regression/Optimization with two unknown matrices


I have the matrices $V$ and $V_L$ and want to find two matrices $M, N$ such that $$\|M V N - V_L\| \rightarrow \min.$$

My first idea was to initialize $M$ and $N$ randomly and then solve the problem by alternating: $$M_1 = \arg\min_M \|M V N_0 - V_L\|$$ $$N_1 = \arg\min_N \|M_1 V N - V_L\|$$ $$\dots$$ and so on. What do you think, or is there an algorithm that can help me here? I have looked around quite a bit but couldn't find anything useful.
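For what it's worth, the alternating scheme above is easy to try in practice: with one factor fixed, each subproblem is an ordinary linear least-squares problem (in the Frobenius norm), solvable in closed form with a pseudoinverse. Here is a minimal sketch of that idea; the function name `alternating_ls` and the iteration count are my own choices, not from any standard library:

```python
import numpy as np

def alternating_ls(V, V_L, iters=20, seed=0):
    """Alternating least squares for min ||M V N - V_L||_F.

    V   : (n, m) known matrix
    V_L : (p, q) target matrix
    Returns M of shape (p, n) and N of shape (m, q).
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    p, q = V_L.shape
    # Random initialization of N (M is computed first, so it needs no init)
    N = rng.standard_normal((m, q))
    M = np.zeros((p, n))
    for _ in range(iters):
        # Fix N: min_M ||M (V N) - V_L||_F  =>  M = V_L (V N)^+
        M = V_L @ np.linalg.pinv(V @ N)
        # Fix M: min_N ||(M V) N - V_L||_F  =>  N = (M V)^+ V_L
        N = np.linalg.pinv(M @ V) @ V_L
    return M, N
```

Each half-step solves its subproblem exactly, so the residual is non-increasing across iterations; note the factorization is not unique (e.g. $M \mapsto cM$, $N \mapsto N/c$ gives the same product), so convergence is to a local solution, not a unique pair.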