Consider the following optimization: $$ f(x)= \min \sum_{i=1}^n \left(x_i-\sum_{j=1}^n x_j\right)^2 $$
Let $g_i(x)=x_i-\sum_{j=1}^n x_j$; then
$$ f(x)= \min \sum_{i=1}^n g_i(x)^2 $$
The Hessian matrix of $g_i(x)^2$ has determinant zero for every $i$; it is also positive semi-definite (note that a zero determinant alone does not imply positive semi-definiteness — here it holds because $g_i(x)^2$ is the square of a linear function), so $g_i(x)^2$ is convex. At least one of the eigenvalues is equal to zero as well.
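A quick numerical sketch of this claim (my own illustration, not part of the question): since $g_i(x)$ is linear with constant gradient $a_i = e_i - \mathbf{1}$, the Hessian of $g_i(x)^2$ is the rank-one matrix $2\,a_i a_i^{\mathsf T}$, which has determinant zero and nonnegative eigenvalues, exactly one of them nonzero.

```python
import numpy as np

n = 4
i = 0

# Gradient of g_i(x) = x_i - sum_j x_j is the constant vector a = e_i - ones(n),
# so the Hessian of g_i(x)^2 is H = 2 * a a^T (rank one for n >= 2).
a = np.zeros(n)
a[i] = 1.0
a -= 1.0

H = 2.0 * np.outer(a, a)
eigvals = np.linalg.eigvalsh(H)

assert np.isclose(np.linalg.det(H), 0.0)      # determinant is zero
assert np.all(eigvals >= -1e-9)               # positive semi-definite
assert np.isclose(eigvals.max(), 2 * a @ a)   # single nonzero eigenvalue 2*||a||^2
```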
To solve this quadratic optimization problem, Cplex/OPL can be used, but I am looking for a theorem or property that applies specifically to this type of optimization problem (Hessian with determinant equal to zero, positive semi-definite, and convex).
Is there any theorem or property that can simplify optimization of this function?
Thank you very much!
You write $g_i$ as a function. Can we not just use additional variables $y$ and $s$ and write the following linear equations plus convex objective:
$$ \begin{array}{l} s = \sum_i x_i \\ y_i = x_i - s\\ \min \sum_i y^2_i \end{array} $$
This can be solved straightforwardly and efficiently with Cplex/OPL (the Q matrix, which now involves only the $y$ variables, is positive definite). This formulation is somewhat optimized to keep things as linear as possible and to minimize the number of nonzero elements in the constraint matrix.
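As a sanity check (a small sketch of my own, using plain numpy rather than Cplex/OPL), one can verify numerically that the reformulated objective with auxiliary variables $s$ and $y$ agrees with the original $f(x)=\sum_i\bigl(x_i-\sum_j x_j\bigr)^2$ for arbitrary $x$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
x = rng.normal(size=n)

# Original objective: f(x) = sum_i (x_i - sum_j x_j)^2
f_original = np.sum((x - x.sum()) ** 2)

# Reformulation: s = sum_i x_i, y_i = x_i - s, objective sum_i y_i^2
s = x.sum()
y = x - s
f_reformulated = np.sum(y ** 2)

assert np.isclose(f_original, f_reformulated)
```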
Am I missing something?