Assume I want to minimise this
$$ \min_{x,y} \left\| A - x y^T \right\|_{\text{F}}^2$$
then I am finding the best rank-$1$ approximation of $A$ in the squared-error sense. This can be done via the SVD, taking $x$ and $y$ as the left and right singular vectors corresponding to the largest singular value of $A$ (scaled by that singular value).
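As a quick numerical illustration of this (a sketch of my own, using NumPy; variable names are mine), the rank-$1$ truncated SVD achieves a residual equal to the sum of the squared trailing singular values, per the Eckart–Young theorem:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))

# Best rank-1 approximation via the SVD: x = sigma_1 * u_1, y = v_1
U, s, Vt = np.linalg.svd(A)
x = s[0] * U[:, 0]
y = Vt[0, :]

# Squared Frobenius residual of the rank-1 approximation
resid = np.linalg.norm(A - np.outer(x, y), "fro") ** 2

# Eckart-Young: the residual equals the sum of the squared
# trailing singular values sigma_2^2 + ... + sigma_r^2
print(np.isclose(resid, np.sum(s[1:] ** 2)))  # True
```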
Now suppose instead that $b$ is fixed. Is it possible to solve the following for $x$?
$$ \min_{x} \left\| A - x b^T \right\|_{\text{F}}^2$$
If this is possible, is there also a way to solve
$$ \min_{x} \left\| A - x b^T \right\|_{\text{F}}^2 + \left\| C - x d^T \right\|_{\text{F}}^2$$
where I think of $x$ as the best "average" solution between the two parts of the cost function?
I am of course hoping for a closed-form solution, but a nice iterative optimisation approach would also be useful.
This is a convex optimization problem, so you can easily solve it numerically using CVX.
If you expand the Frobenius norms using the trace operator, you can also derive a closed-form solution. For the single-term problem, $\left\| A - x b^T \right\|_{\text{F}}^2 = \operatorname{tr}(A^T A) - 2 x^T A b + \left\| x \right\|^2 \left\| b \right\|^2$, so setting the gradient with respect to $x$ to zero gives $x = \frac{A b}{b^T b}$. The two-term problem works the same way, yielding $x = \frac{A b + C d}{b^T b + d^T d}$, which is exactly the "average" solution you describe.
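To make the closed form concrete, here is a small NumPy sketch (my own; dimensions and names are illustrative). It computes the minimizer $x = (A b + C d)/(b^T b + d^T d)$ obtained by zeroing the gradient of the two-term cost, and checks that random perturbations never decrease the cost:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, p = 6, 4, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(n)
C = rng.standard_normal((m, p))
d = rng.standard_normal(p)

def cost(x):
    """Two-term squared Frobenius cost from the question."""
    return (np.linalg.norm(A - np.outer(x, b), "fro") ** 2
            + np.linalg.norm(C - np.outer(x, d), "fro") ** 2)

# Closed-form minimizer from setting the gradient to zero:
# x = (A b + C d) / (b'b + d'd)
x_star = (A @ b + C @ d) / (b @ b + d @ d)

# Sanity check: no random perturbation of x_star improves the cost
for _ in range(100):
    x_pert = x_star + 0.1 * rng.standard_normal(m)
    assert cost(x_star) <= cost(x_pert) + 1e-12
```

Dropping the $C, d$ term recovers $x = A b / (b^T b)$ for the single-term problem.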