Suppose that $\mathbf{a}$ is an $N$-dimensional column vector, $\mathbf{a}=[a_1,a_2,\ldots,a_N]^{T}$ with $a_i \ge 0$ for $i\in\{1,\ldots,N\}$, and $\mathbf{W}$ is a symmetric matrix. Is there any reference on how to solve the following optimization problem? \begin{equation} \begin{aligned} &\min_{\mathbf{a}}\quad \mathbf{a}^{T}\mathbf{W}\mathbf{a}, \\ &\text{subject to} \quad \mathbf{1}^{T}_{N} \mathbf{a}=1, \quad a_{i} \ge 0. \end{aligned} \end{equation} Here $\mathbf{1}_{N}$ is the $N$-dimensional column vector with all entries equal to $1$, i.e. $\mathbf{1}_{N}=[1,1,\ldots,1]^{T}$. I found a reference [1] that claims the closed-form solution to this problem is
\begin{equation} \begin{aligned} \mathbf{a} = \frac{\mathbf{W}^{-1} \mathbf{1}_N}{\mathbf{1}_N^{\mathrm{T}} \mathbf{W}^{-1} \mathbf{1}_N} \end{aligned} \end{equation} However, I'm confused about this: $\mathbf{W}$ is only assumed to be symmetric, so $\mathbf{W}^{-1}$ may not exist, and hence a closed-form solution of this form is not guaranteed. Is there a better way to solve this problem?
[1] Zhao, Xiaochuan, and Ali H. Sayed. "Clustering via diffusion adaptation over networks." 2012 3rd International Workshop on Cognitive Information Processing (CIP). IEEE, 2012.
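As a sanity check on the closed-form expression above, here is a small numerical sketch (my own example, not from [1]): I pick a positive-definite $\mathbf{W}$ for which the closed form happens to land inside the nonnegative orthant, so it also solves the problem with the inequality constraints, and compare it against a general-purpose constrained solver.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical positive-definite W (my assumption, chosen so that the
# closed form is already nonnegative).
W = np.diag([1.0, 2.0, 4.0])
N = W.shape[0]
ones = np.ones(N)

# Closed form: a = W^{-1} 1 / (1^T W^{-1} 1), computed via a linear solve
# rather than an explicit inverse.
a_closed = np.linalg.solve(W, ones)
a_closed /= ones @ a_closed

# Numerical solution of the full problem, with both the sum constraint
# and the nonnegativity bounds.
res = minimize(lambda a: a @ W @ a,
               x0=np.full(N, 1.0 / N),
               jac=lambda a: 2 * W @ a,
               bounds=[(0, None)] * N,
               constraints={'type': 'eq', 'fun': lambda a: ones @ a - 1},
               method='SLSQP')

print(a_closed)   # [4/7, 2/7, 1/7]
print(res.x)      # agrees with the closed form up to solver tolerance
```

For this $\mathbf{W}$ the two answers coincide; in general, if the closed form has a negative entry, the inequality constraints become active and the formula from [1] no longer applies directly.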
===========================Update================================ Thanks to @cdalitz for the reply; you are right. Let me describe how I now think this problem can be solved.
In my opinion, solving this problem is complicated whether or not $\mathbf{W}$ is invertible.
- When $\mathbf{W}$ is not invertible, as @cdalitz pointed out, the solution is not unique. I'm not sure whether it can be written in terms of a pseudo-inverse, but in this case I think a good approach is a gradient descent method, with the iterates projected back onto the constraint set to keep them feasible.
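A minimal sketch of the projected-gradient idea (my own illustration, not from [1]): the feasible set $\{\mathbf{a} : \mathbf{a}\ge 0,\ \mathbf{1}^T\mathbf{a}=1\}$ is the probability simplex, and Euclidean projection onto it has a well-known $O(N\log N)$ sorting-based algorithm. The step size is a heuristic choice below $1/L$, where $L = 2\lVert \mathbf{W}\rVert_2$ bounds the gradient's Lipschitz constant; if $\mathbf{W}$ is indefinite the method may only find a local minimum.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {a : a >= 0, sum(a) = 1}."""
    u = np.sort(v)[::-1]                     # sort in decreasing order
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > css - 1)[0][-1]
    theta = (css[rho] - 1) / (rho + 1)       # uniform shift before clipping
    return np.maximum(v - theta, 0.0)

def projected_gradient(W, steps=2000, eta=None):
    """Minimize a^T W a over the simplex by projected gradient descent."""
    N = W.shape[0]
    if eta is None:
        # Heuristic step size below 1/L with L = 2*||W||_2.
        eta = 0.9 / (2 * np.linalg.norm(W, 2) + 1e-12)
    a = np.full(N, 1.0 / N)                  # feasible starting point
    for _ in range(steps):
        a = project_simplex(a - eta * (2 * W @ a))  # grad of a^T W a is 2 W a
    return a

# Singular W (the rank-deficient example below): any (a1, a2, 0) with
# a1 + a2 = 1 is optimal, so the method only needs to drive a3 to 0.
W = np.diag([0.0, 0.0, 1.0])
a = projected_gradient(W)
print(a, a @ W @ a)
```

Since the objective stays quadratic, each iteration costs one matrix-vector product plus the projection, so no inverse of $\mathbf{W}$ is ever needed.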
- When $\mathbf{W}$ is invertible, computing its inverse can still be expensive. I noticed that the paper [1] also provides an approximate solution obtained by discarding the off-diagonal elements of $\mathbf{W}$. I think the reason for doing so is to guarantee that the resulting matrix is invertible and to avoid computing the inverse of the full $\mathbf{W}$.
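For the diagonal approximation, the closed form simplifies nicely: with the off-diagonal entries discarded and all diagonal entries positive, $\mathbf{W}^{-1}\mathbf{1}_N$ has entries $1/W_{ii}$, so the weights become $a_i = (1/W_{ii}) / \sum_j (1/W_{jj})$, which are automatically nonnegative. A two-line sketch with a made-up matrix:

```python
import numpy as np

# Hypothetical symmetric W with positive diagonal (my assumption).
W = np.array([[4.0, 1.0, 0.5],
              [1.0, 2.0, 0.3],
              [0.5, 0.3, 1.0]])

d = np.diag(W)                     # keep only the diagonal of W
a = (1.0 / d) / np.sum(1.0 / d)    # a_i = (1/W_ii) / sum_j (1/W_jj)

print(a, a.sum())   # [1/7, 2/7, 4/7], sums to 1
```

So besides guaranteeing invertibility, the diagonal approximation also makes the nonnegativity constraint hold for free.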
If the matrix $W$ has rank less than $N-1$, the solution will not be unique. Consider e.g. the case $$W=\left(\begin{array}{ccc}0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1\end{array}\right)$$ The solution is any vector $\vec{a}=(a_1,a_2,0)^T$ with $a_1+a_2=1$, of which there are infinitely many. For rank $N-1$, I cannot immediately construct a similar example, but I guess the solution is ambiguous in this case, too.
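The non-uniqueness in the rank-deficient example is easy to verify numerically: any two feasible vectors of the form $(a_1, a_2, 0)^T$ attain the same (zero) objective value.

```python
import numpy as np

W = np.diag([0.0, 0.0, 1.0])   # the rank-1 matrix from the example above

# Two different feasible vectors, both achieving the minimum value 0.
for a in (np.array([1.0, 0.0, 0.0]),
          np.array([0.3, 0.7, 0.0])):
    assert np.isclose(a.sum(), 1) and (a >= 0).all()   # feasibility
    print(a, a @ W @ a)                                # objective is 0 for both
```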