Minimizing $\lVert AX - B \rVert$ where both $A$ and $X$ are variable matrices.


I'm trying to solve a problem of the form:

$$\text{minimize} \: \lVert AX - B \rVert^2_2$$

Subject to

$$A \geq 0 \\ X \geq 0.$$

Both $A$ and $X$ are variables (the inequalities hold elementwise), and $B$ is a constant matrix.

So essentially the problem is to factorize $B$ as the product $AX$, but unlike the typical recommender-system setting, we know $B$ in its entirety.

Does this fall under a known class of optimization problems? How would I go about solving it to find the best $A$ and $X$? The only approach I currently have is gradient descent, but I doubt it will find a good solution.
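For concreteness, here is a minimal sketch (in NumPy, with made-up dimensions and step size) of the projected-gradient-descent approach I have in mind; the nonnegativity constraints are enforced by clipping negative entries to zero after each step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem data: B is a known nonnegative matrix; the inner dimension r
# of A (m x r) and X (r x n) is fixed in advance, not a free parameter.
m, n, r = 20, 15, 4
B = rng.random((m, n))

# Nonnegative random initialization of the two factors.
A = rng.random((m, r))
X = rng.random((r, n))

def loss(A, X, B):
    # Squared Frobenius norm of the residual.
    return np.linalg.norm(A @ X - B) ** 2

initial = loss(A, X, B)
lr = 1e-3  # step size (would need tuning in practice)
for _ in range(2000):
    R = A @ X - B           # residual
    grad_A = 2 * R @ X.T    # gradient of ||AX - B||_F^2 w.r.t. A
    grad_X = 2 * A.T @ R    # gradient w.r.t. X
    # Gradient step followed by projection onto the nonnegative orthant.
    A = np.maximum(A - lr * grad_A, 0.0)
    X = np.maximum(X - lr * grad_X, 0.0)
```

Since the objective is nonconvex in $(A, X)$ jointly (though convex in each factor separately), this only finds a local minimum, which is why I doubt it is the right tool.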

Edit: The dimensions of $A$ and $X$ are predetermined and not a free parameter.

Every element of $B$ is nonnegative.