Linear optimization with constraints


Let $A$, $X$ and $B$ be matrices with entries in $[0, 1]$, of sizes $n\times m$, $m \times m$ and $n\times m$ respectively, with the constraint that every row sums to $1$. Given $A$ and $B$, how can I find $X$ minimizing $\|AX - B\|$?

For arbitrary $X$ we could use multivariate logistic regression, but then we would lose the constraint on $X$.
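One way to keep the constraint is to treat this as a small constrained least-squares problem: minimize $\|AX - B\|_F^2$ subject to each row of $X$ lying in the probability simplex. A sketch using SciPy's SLSQP solver, with randomly generated row-stochastic test data (all variable names and the solver choice are my own, not from the question):

```python
import numpy as np
from scipy.optimize import minimize

# Random row-stochastic test matrices (illustration only).
rng = np.random.default_rng(0)
n, m = 4, 3
A = rng.random((n, m)); A /= A.sum(axis=1, keepdims=True)
B = rng.random((n, m)); B /= B.sum(axis=1, keepdims=True)

def objective(x):
    # Frobenius-norm residual, with X flattened into a vector of m*m entries.
    X = x.reshape(m, m)
    return np.linalg.norm(A @ X - B) ** 2

# One equality constraint per row of X: its entries must sum to 1.
constraints = [{"type": "eq",
                "fun": lambda x, i=i: x.reshape(m, m)[i].sum() - 1.0}
               for i in range(m)]
# Entry-wise bounds keep every x_ij in [0, 1].
bounds = [(0.0, 1.0)] * (m * m)

# Start from the uniform row-stochastic matrix (all entries 1/m).
x0 = np.full(m * m, 1.0 / m)
res = minimize(objective, x0, method="SLSQP",
               bounds=bounds, constraints=constraints)
X = res.x.reshape(m, m)
print(X)
print(X.sum(axis=1))
```

This is a convex quadratic program, so for small $m$ a general-purpose solver like SLSQP is adequate; for larger problems a dedicated QP solver would be the better choice.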


Best answer

$\textbf{Statement}$: Let $A \in \mathbb{R}^{n \times m}$, $X \in \mathbb{R}^{m \times m}$ and $B \in \mathbb{R}^{n \times m}$ be such that each entry $a_{fg}, x_{ij}, b_{kl} \in [0, 1]$ and the row-sum equalities $A \, 1_m = 1_n$, $X \, 1_m = 1_m$ and $B \, 1_m = 1_n$ hold.

The least-squares problem $\min_X \|AX - B\|$ is solved by $X = A^+ B$, where $A^+$ is the Moore–Penrose pseudoinverse of $A$. Note that this is the unconstrained minimizer; it need not satisfy the entry-wise bounds $x_{ij} \in [0, 1]$ in general.
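A minimal numpy sketch of the pseudoinverse solution, on made-up row-stochastic data. One pleasant property worth noting: when $A$ has full column rank, $A^+ A = I$, so $X 1_m = A^+ B 1_m = A^+ 1_n = A^+ A 1_m = 1_m$, i.e. the rows of $X = A^+ B$ automatically sum to $1$ (the entries may still fall outside $[0, 1]$, however):

```python
import numpy as np

# Small row-stochastic example (data made up for illustration).
A = np.array([[0.7, 0.3],
              [0.2, 0.8],
              [0.5, 0.5]])          # n x m, rows sum to 1
B = np.array([[0.6, 0.4],
              [0.3, 0.7],
              [0.45, 0.55]])        # n x m, rows sum to 1

# Unconstrained least-squares minimizer of ||AX - B||_F.
X = np.linalg.pinv(A) @ B

print(X)
print(X.sum(axis=1))  # rows sum to 1 since this A has full column rank
```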