Let $A,B\in\mathbb{R}^{m\times n}$ be matrices with all positive entries. I want to compute the following minimum. $$\min_{\vec{u}\in\mathbb{R}^m,\ \ \vec{v}\in\mathbb{R}^n} \ \ \sum_{i=1}^m \sum_{j=1}^n (u_i v_j A_{ij} - B_{ij})^2.$$
Rephrased in terms of matrix notation and the Frobenius norm $\|\cdot\|_F$, this would be the following. $$\min_{\vec{u}\in\mathbb{R}^m,\ \ \vec{v}\in\mathbb{R}^n} \ \ \| D_{\vec{u}} A D_{\vec{v}} - B\|_F^2,$$ where $D_{\vec{u}}$ is the diagonal matrix with diagonal $\vec{u}$ (and likewise for $D_{\vec{v}}$). Does anyone know how to find this minimum?
After much searching, I have found that this can be expressed as a weighted low-rank approximation problem. We want to approximate the matrix $B\oslash A$ (where $\oslash$ denotes entrywise division) by a rank-1 matrix $\vec{u}\vec{v}^\top$ minimizing the weighted Frobenius norm with weight matrix $W$ given by $W_{ij}=A_{ij}^2$.
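To check the equivalence, expand the weighted objective (here $\oslash$ denotes entrywise division, which is well defined since $A$ has positive entries): $$\sum_{i,j} W_{ij}\left(u_i v_j - \frac{B_{ij}}{A_{ij}}\right)^2 = \sum_{i,j} A_{ij}^2\left(u_i v_j - \frac{B_{ij}}{A_{ij}}\right)^2 = \sum_{i,j} \left(u_i v_j A_{ij} - B_{ij}\right)^2,$$ which is exactly the original objective.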
According to the Wikipedia page on weighted low-rank approximation, there is no closed-form solution in terms of the SVD; an iterative optimization method is required. One such method, which I derived myself (it is an alternating least squares iteration, and many others have invented it too), is below.
Initialize $\vec{v}$ (e.g. to all ones) and repeat the following updates until the desired accuracy is reached (convergence is linear). $$u_i\leftarrow \frac{\sum_j v_j A_{ij} B_{ij}}{\sum_j v_j^2 A_{ij}^2},\qquad u\leftarrow u/\|u\|,\qquad v_j\leftarrow \frac{\sum_i u_i A_{ij} B_{ij}}{\sum_i u_i^2 A_{ij}^2}.$$ Each update is the exact least-squares solution for one vector with the other held fixed; the normalization of $u$ only fixes the scale ambiguity $(\vec{u},\vec{v})\mapsto(c\vec{u},\vec{v}/c)$ and does not change the objective.
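A minimal NumPy sketch of this alternating least squares iteration (the function name, initialization, and stopping rule are my own choices, not part of the method itself):

```python
import numpy as np

def rank1_scaled_fit(A, B, iters=500, tol=1e-14):
    """Minimize sum_ij (u_i v_j A_ij - B_ij)^2 over vectors u, v.

    A, B: (m, n) arrays with positive entries.
    Returns (u, v) with u normalized to unit length.
    """
    m, n = A.shape
    v = np.ones(n)          # arbitrary positive initialization
    prev_obj = np.inf
    for _ in range(iters):
        # u_i <- sum_j v_j A_ij B_ij / sum_j v_j^2 A_ij^2
        u = (A * B) @ v / ((A ** 2) @ (v ** 2))
        # fix the scale ambiguity (u, v) -> (c u, v / c)
        u /= np.linalg.norm(u)
        # v_j <- sum_i u_i A_ij B_ij / sum_i u_i^2 A_ij^2
        v = (A * B).T @ u / ((A ** 2).T @ (u ** 2))
        # stop when the objective stops decreasing
        obj = np.sum((u[:, None] * A * v[None, :] - B) ** 2)
        if prev_obj - obj < tol:
            break
        prev_obj = obj
    return u, v
```

As a sanity check, if $B$ is exactly of the form $D_{\vec{u}}AD_{\vec{v}}$, the iteration recovers a pair $(\vec{u},\vec{v})$ whose residual is at machine-precision level.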