Assume we are given two matrices $A, B \in \mathbb R^{n \times m}$ and a vector $v \in \mathbb R^n$. Let $\|\cdot\|_F$ be the Frobenius norm of a matrix. How can we solve the following optimization problem in $X \in \mathbb R^{n \times n}$?
$$\begin{array}{ll} \text{minimize} & \|A-XB\|_F\\ \text{subject to} & Xv=0\end{array}$$
Can this problem be converted to a constrained least squares problem with the optimization variable being a vector instead of a matrix? If so, how would that work?
Are there some references about solving such constrained linear least Frobenius norm problems?
Thanks!
As the others show, the answer to your question is affirmative. However, I don't see the point of doing so, since the problem can actually be converted into an unconstrained least-squares problem.
Let $Q$ be a real orthogonal matrix whose last column is $\frac{v}{\|v\|}$. For instance, you may take $Q$ to be a Householder matrix:
$$ Q = I - 2uu^T,\qquad u = \frac{v-\|v\|e_n}{\|v-\|v\|e_n\|}, \qquad e_n=(0,0,\ldots,0,1)^T. $$
(If $v$ is already a positive multiple of $e_n$, simply take $Q = I$.)

Since $Qe_n = \frac{v}{\|v\|}$, the constraint $Xv=0$ is equivalent to $XQe_n=0$, which in turn means that $XQ$ has the form $XQ = (\widetilde{X},0)$ for some $\widetilde{X}\in\mathbb{R}^{n\times(n-1)}$. So, if you partition $Q^TB$ as $Q^TB = \pmatrix{\widetilde{B}\\ \ast}$, where $\widetilde{B}$ is $(n-1)\times m$, then $XB = XQ\,Q^TB = \widetilde{X}\widetilde{B}$, and your minimisation problem can be rewritten as the unconstrained least-squares problem
$$\min_{\widetilde{X}\in\mathbb{R}^{n\times(n-1)}} \|A-\widetilde{X}\widetilde{B}\|_F.$$
Its least-norm solution is given by $\widetilde{X} = A\widetilde{B}^+$, where $\widetilde{B}^+$ denotes the Moore-Penrose pseudoinverse of $\widetilde{B}$. Once $\widetilde{X}$ is obtained, you can recover $X = (\widetilde{X},0)Q^T$.
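For what it's worth, here is a short numerical sketch of the construction above in NumPy (the variable names are mine, and it assumes $v$ is not already a positive multiple of $e_n$, so the Householder vector is well defined):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 6
A = rng.standard_normal((n, m))
B = rng.standard_normal((n, m))
v = rng.standard_normal(n)

# Householder matrix Q with last column v/||v||:  Q = I - 2 u u^T
e_n = np.zeros(n)
e_n[-1] = 1.0
w = v - np.linalg.norm(v) * e_n      # assumes w != 0, i.e. v not parallel to e_n
u = w / np.linalg.norm(w)
Q = np.eye(n) - 2.0 * np.outer(u, u)

# Partition Q^T B: keep the first n-1 rows as B_tilde
B_tilde = (Q.T @ B)[:-1, :]

# Least-norm solution of the unconstrained problem, then recover X
X_tilde = A @ np.linalg.pinv(B_tilde)
X = np.hstack([X_tilde, np.zeros((n, 1))]) @ Q.T

print(np.linalg.norm(X @ v))         # ~0: the constraint Xv = 0 holds
```

One can also check numerically that $XB = \widetilde{X}\widetilde{B}$, confirming that the reduced problem has the same residual as the original one.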