Minimization of a weighted least-squares problem by Lagrange multiplier method


Problem:

Let $Y = (y_1, y_2, \dots, y_m) \in \mathbb{R}^{n \times m}$ (columns $y_i \in \mathbb{R}^n$) and let $k \in \mathbb{R}^{m}$ satisfy $\sum_{i=1}^{m} k_i = 1$ and $k \geq 0$. Show that $x = Yk$ is a minimizer of $h(x) = \dfrac{1}{2} \sum_{i=1}^{m} k_i \left\|x-y_i \right\|_{2}^{2}.$
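As a quick numerical sanity check (not a proof), one can compare $h$ at the candidate minimizer $x = Yk$ against $h$ at randomly perturbed points. The dimensions $m$, $n$ and the random data below are illustrative choices, not from the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3                        # illustrative sizes
Y = rng.standard_normal((n, m))    # columns y_1, ..., y_m in R^n
k = rng.random(m)
k /= k.sum()                       # k >= 0 and sum(k) = 1

def h(x):
    # h(x) = 1/2 * sum_i k_i * ||x - y_i||_2^2
    return 0.5 * sum(k[i] * np.sum((x - Y[:, i]) ** 2) for i in range(m))

x_star = Y @ k                     # candidate minimizer: the weighted mean of the y_i

# h should be no smaller at any nearby perturbed point
for _ in range(1000):
    x = x_star + 0.1 * rng.standard_normal(n)
    assert h(x) >= h(x_star)
```

This only checks the claim empirically for one random instance; the actual question is how to establish it analytically.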

My trial:

Define the Lagrange function: $$\mathcal{L}(x, \lambda) = \dfrac{1}{2} \sum_{i=1}^{m} k_i \left\|x-y_i \right\|_{2}^{2} - \lambda \left( \sum_{i=1}^{m} k_i - 1 \right).$$

Then, differentiating with respect to $x$ and setting the result to zero:

$$\nabla_x \mathcal{L} = \sum_{i=1}^{m} k_i \left(x-y_i\right) = 0,$$

where the constraint term drops out since it does not involve $x$.

My puzzle is that:

(1) Should I then set $x = y_i$ as the solution of $\nabla_x \mathcal{L} = 0$?

(2) How does this relate to the constraint $\sum_{i=1}^{m} k_i =1$?

I would greatly appreciate your help.