Let $A$ be an $M \times N$ matrix, let $f$ be a column vector with $N$ elements, let $c$ be a column vector with $N$ elements that I need to solve for, and let $\| \cdot \|$ denote the Frobenius norm. I need to choose $c$, subject to $\sum_i c_i = 1$, to minimize the objective function:
$|| A - \frac{Afc^T}{c^T f} ||^2$
I do not need a closed-form solution, but I do need to know the number of solutions to this problem. If there is a single closed-form solution, what is it?
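For concreteness, here is a small NumPy sketch of the objective; the matrix sizes and random data below are purely illustrative, not part of the problem:

```python
import numpy as np

def objective(A, f, c):
    # phi(c) = || A - (A f c^T) / (c^T f) ||_F^2  (squared Frobenius norm)
    R = A - np.outer(A @ f, c) / (c @ f)
    return np.sum(R**2)

rng = np.random.default_rng(0)
M, N = 5, 4                      # illustrative dimensions
A = rng.standard_normal((M, N))
f = rng.standard_normal(N)
c = rng.standard_normal(N)
c = c / c.sum()                  # enforce the constraint sum_i c_i = 1
val = objective(A, f, c)
```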
This is what I've done so far: I define $L$ as the Lagrangian, $\lambda$ as the Lagrange multiplier, and $e$ as an $N$-dimensional column vector of ones. Then
$L = || A - \frac{Afc^T}{c^T f} ||^2 + \lambda (1 - e^T c)$
The derivative of $L$ with respect to the column vector $c$ is
$\frac{2f (c^T A^T A f)}{(c^T f)^2} - \frac{2 A^T A f}{c^T f} + \frac{2 (c^T f c - c^T c f) (f^T A^T A f)}{(c^T f)^3} - \lambda e$
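As a numerical sanity check (not part of the derivation), the gradient of the norm term above (everything except the linear $-\lambda e$ term) can be compared against central finite differences on random data:

```python
import numpy as np

def objective(A, f, c):
    # || A - (A f c^T) / (c^T f) ||_F^2
    R = A - np.outer(A @ f, c) / (c @ f)
    return np.sum(R**2)

def grad_norm_term(A, f, c):
    # The three gradient terms from the question, with s = c^T f, G = A^T A
    s = c @ f
    Gf = A.T @ (A @ f)
    term1 = 2 * f * (c @ Gf) / s**2
    term2 = -2 * Gf / s
    term3 = 2 * (s * c - (c @ c) * f) * (f @ Gf) / s**3
    return term1 + term2 + term3

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
f = rng.uniform(0.5, 1.5, 4)     # keep c^T f away from zero
c = rng.uniform(0.5, 1.5, 4)

g = grad_norm_term(A, f, c)
eps = 1e-6
g_num = np.array([(objective(A, f, c + eps * e) -
                   objective(A, f, c - eps * e)) / (2 * eps)
                  for e in np.eye(4)])
err = np.max(np.abs(g - g_num))
```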
Setting this equal to zero and solving for $c$ is difficult (at least for me), if not impossible in closed form. I just really want to know if the solution is unique. Is there only one solution, or more than a single solution? I think the way to proceed is to calculate the hessian, and if the hessian is positive semi-definite, then I know that there is one unique solution. But how do I calculate the hessian of this? Is there another theorem or way to proceed?
The constraint is inoperative. To see this, let $x=\beta c$; then for any $\beta \neq 0$ we have
$$\frac{Afx^T}{f^Tx} = \frac{Af(\beta c)^T}{f^T(\beta c)} = \frac{Afc^T}{f^Tc}$$
So let's write the problem in terms of an unconstrained vector $x$ and the matrix
$$M= \frac{Afx^T}{f^Tx} - A$$
(note that this $M$ is a matrix, not the row dimension from the problem statement). After finding $x$, I can scale it to recover
$$c=\frac{x}{e^Tx}$$

Write down the cost function, scale it by one half, and find its differential and gradient:
$$\eqalign{ \phi &= \tfrac{1}{2}{\rm tr}\big(M^TM\big) = \tfrac{1}{2}M:M \cr d\phi &= M:dM \cr &= M:\frac{Af\,dx^T}{f^Tx} - M:\frac{Afx^T}{(f^Tx)^2}\,(f^Tdx)\cr &= M^T\Big(\frac{Af}{f^Tx}\Big):dx - \Bigg(\frac{x^TM^TAf}{(f^Tx)^2}\Bigg)\,(f:dx) \cr \frac{\partial\phi}{\partial x} &= M^T\Big(\frac{Af}{f^Tx}\Big) - \Bigg(\frac{x^TM^TAf}{(f^Tx)^2}\Bigg)\,f \cr }$$
Setting the gradient to zero leaves a result in the form of an eigenvalue equation:
$$\lambda\,f = \big(M^TA\big)\,f = \Bigg(\frac{xf^TA^TA}{x^Tf}-A^TA\Bigg)\,f = \Bigg(\frac{xf^T}{x^Tf}-I\Bigg)\,A^TA\,f$$
Multiplying on the left by $f^T$,
$$\lambda\,f^Tf = \Big(f^T-f^T\Big)\,A^TA\,f = 0$$
we see that $\lambda=0$ $\big($since $f^Tf>0\big)$.
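The gradient formula for $\phi$ can be spot-checked numerically against finite differences; the random data below is purely illustrative:

```python
import numpy as np

def phi(A, f, x):
    # phi(x) = (1/2) tr(M^T M)  with  M = (A f x^T)/(f^T x) - A
    M = np.outer(A @ f, x) / (f @ x) - A
    return 0.5 * np.sum(M**2)

def grad_phi(A, f, x):
    # M^T (Af) / (f^T x)  -  (x^T M^T A f) / (f^T x)^2 * f
    s = f @ x
    M = np.outer(A @ f, x) / s - A
    MtAf = M.T @ (A @ f)
    return MtAf / s - (x @ MtAf) / s**2 * f

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 4))
f = rng.uniform(0.5, 1.5, 4)     # keep f^T x away from zero
x = rng.uniform(0.5, 1.5, 4)

g = grad_phi(A, f, x)
eps = 1e-6
g_num = np.array([(phi(A, f, x + eps * e) -
                   phi(A, f, x - eps * e)) / (2 * eps)
                  for e in np.eye(4)])
err = np.max(np.abs(g - g_num))
```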
Substituting $\lambda=0$ and rearranging the equation gives
$$\eqalign{ &\Bigg(\frac{xf^T}{f^Tx}\Bigg)\,(A^TA\,f) = (A^TA\,f) \\ &\Bigg(\frac{xf^T}{f^Tx}\Bigg)\,w = w \\ }$$
This is an eigenvalue equation for a rank-${\tt1}$ matrix, whose only eigenvector with nonzero eigenvalue is $x$ itself; so, up to a scale factor (which the normalization absorbs), the unique solution is
$$x = w = A^TA\,f$$
assuming $f^Tw = f^TA^TA\,f \neq 0$, i.e. $Af \neq 0$. From this we can calculate the constrained solution
$$c = \frac{x}{e^Tx} = \frac{A^TAf}{e^TA^TAf}$$
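As a final spot-check (random data, purely a sketch), the closed form $c = A^TAf/(e^TA^TAf)$ should beat any other feasible candidate:

```python
import numpy as np

def objective(A, f, c):
    # || A - (A f c^T) / (c^T f) ||_F^2
    R = A - np.outer(A @ f, c) / (c @ f)
    return np.sum(R**2)

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 4))
f = rng.standard_normal(4)

w = A.T @ (A @ f)        # w = A^T A f
c_star = w / w.sum()     # c = A^T A f / (e^T A^T A f)
best = objective(A, f, c_star)

# Random candidates, normalized so that sum_i c_i = 1
vals = []
for _ in range(200):
    c = rng.standard_normal(4)
    vals.append(objective(A, f, c / c.sum()))
```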