I have the following problem from a practice exam for a course in optimization:
$$\begin{array}{ll} \text{minimize} & \sum_{i=1}^n w_i^2 x_i^2\\ \text{subject to} & Ax = b\end{array}$$
where $A \in \mathbb R^{m \times n}$, $b \in \mathbb R^m$ and $w_i > 0$.
A) is the problem convex?
My attempted answer: I think so. The objective function is convex, being a sum of convex quadratics with positive coefficients $w_i^2$. The constraint $Ax = b$ defines an affine set, which is convex (it may be empty if the system is inconsistent, a single point if $A$ is square and invertible, or an affine subspace in the under-determined case).
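The convexity claim for the objective can be spot-checked numerically by verifying the defining inequality $f(tx + (1-t)y) \le t f(x) + (1-t) f(y)$ on random points. This is only an illustrative sketch; the dimension and weights below are arbitrary choices, not data from the problem.

```python
import numpy as np

# Spot-check convexity of f(x) = sum_i w_i^2 x_i^2 on random points.
# n and w are arbitrary illustrative choices (only w_i > 0 matters).
rng = np.random.default_rng(1)
n = 4
w = rng.uniform(0.5, 2.0, n)

def f(x):
    return np.sum(w**2 * x**2)

x, y = rng.standard_normal(n), rng.standard_normal(n)
for t in np.linspace(0.0, 1.0, 11):
    # Jensen-type inequality that characterizes convexity:
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-12
```

Of course a finite check proves nothing; the actual argument is that $\nabla^2 f = 2W \succ 0$, so $f$ is (strictly) convex.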
B) Under what condition does the problem have a unique global solution?
It might be that $m \le n$, so that the system is not over-determined? The Hessian of the objective is positive definite for all $x$, so if a feasible stationary point exists, it is the unique global minimum.
C) Are all KKT points of the problem global minima?
Not sure how to even find KKT here.
D) Suppose $A$ is of full row rank — find the optimal solution.
Ditto.
Let's write out the Lagrangian:
$$\mathcal L(x, \mu) = \sum_{i=1}^n w_i^2 x_i^2 + \sum_{j=1}^m \mu_j \left(\sum_{i=1}^n A_{ji} x_i - b_j \right).$$
In matrix form this becomes
$$\mathcal L(x, \mu) = x^T W x + \mu^T (Ax - b),$$
where $W$ is the diagonal matrix with $W_{ii} = w_i^2$. Setting the gradient with respect to $x$ to zero gives
$$\nabla_x \mathcal L(x, \mu) = 2Wx + A^T\mu = 0 \implies x = -\frac{1}{2} W^{-1} A^T \mu.$$
Substituting this back into the constraint $Ax = b$ yields $-\frac{1}{2} A W^{-1} A^T \mu = b$, so $\mu = -2\left(A W^{-1} A^T\right)^{-1} b$ and therefore
$$x^* = W^{-1} A^T \left(A W^{-1} A^T\right)^{-1} b.$$
Here $A W^{-1} A^T$ is invertible precisely when $A$ has full row rank. I think you should recheck the question: if $A$ has full *column* rank, i.e. $\operatorname{rank}(A) = n$ with $m > n$, there may not exist any feasible point at all.
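The closed-form solution $x^* = W^{-1} A^T \left(A W^{-1} A^T\right)^{-1} b$ can be verified numerically: it should satisfy $Ax^* = b$, and perturbing it along any null-space direction of $A$ (which preserves feasibility) should strictly increase the objective. A minimal NumPy sketch, with arbitrary illustrative sizes and random data:

```python
import numpy as np

# Verify x* = W^{-1} A^T (A W^{-1} A^T)^{-1} b for a random full-row-rank A.
# m, n, and the random data are illustrative choices, not problem data.
rng = np.random.default_rng(0)
m, n = 3, 5                          # m < n: A is (almost surely) full row rank
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
w = rng.uniform(0.5, 2.0, n)         # w_i > 0

W_inv = np.diag(1.0 / w**2)          # W is diagonal with W_ii = w_i^2

# Closed-form minimizer; use solve() rather than an explicit inverse.
x_star = W_inv @ A.T @ np.linalg.solve(A @ W_inv @ A.T, b)

def f(x):
    return np.sum(w**2 * x**2)

assert np.allclose(A @ x_star, b)    # x* is feasible

# A null-space direction of A: a right singular vector with zero
# singular value (rows of Vt beyond the first m span null(A) here).
_, _, Vt = np.linalg.svd(A)
z = Vt[-1]
assert np.allclose(A @ z, 0)
assert f(x_star + z) > f(x_star)     # feasible perturbation raises the objective
```

The strict increase follows from $f(x^* + z) = f(x^*) + z^T W z$ for $Az = 0$, since the cross term $2 z^T W x^* = -z^T A^T \mu$ vanishes.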