$g(x)=\sum_{v=1}^k c_v\|x-x_v\|_2^2$, with $c_v \in \mathbb{R},\ c_v>0$, and $x,x_v \in \mathbb{R}^n$.
I need to find the minimum of $g$ and reason why it is a global minimum.
I tried the following:
$g(x)=\sum_{v=1}^k c_v\|x-x_v\|_2^2=\sum_{v=1}^kc_v \Big((x_1-x_{v_1})^2+ \ldots+ (x_n-x_{v_n})^2\Big)= \sum_{v=1}^k c_v \sum_{i=1}^n(x_i-x_{v_i})^2$
So $\nabla g(x)=\Big(\frac{\partial g(x)}{\partial x_1}, \ldots,\frac{\partial g(x)}{\partial x_n}\Big)=\Big(\sum_{v=1}^k 2c_v(x_1-x_{v_1}), \ldots,\sum_{v=1}^k 2c_v(x_n-x_{v_n})\Big)$
So it is a row vector with $n$ entries, and the $i$-th entry is $\sum_{v=1}^k 2c_v(x_i-x_{v_i})$; the inner sum over $i$ drops out, since $\frac{\partial}{\partial x_i}$ annihilates every term except the one containing $x_i$.
Is that notation with $\nabla g(x)$ correct? I still have some trouble with it.
So $\nabla g(x)=0 \Leftrightarrow \sum_{v=1}^k 2c_v(x_i-x_{v_i})=0 \text{ for all } i \Leftrightarrow \sum_{v=1}^k 2c_v(x-x_v)=0 \Leftrightarrow x\sum_{v=1}^kc_v=\sum_{v=1}^kc_vx_v \Leftrightarrow x=\frac{\sum_{v=1}^k c_vx_v}{\sum_{v=1}^k c_v}=:\bar x$, which should be the weighted average.
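As a sanity check (not a proof), a quick numerical experiment with made-up weights and points suggests that the gradient vanishes at $\bar x$ and that $g(\bar x)$ is indeed smaller than $g$ at random points:

```python
import numpy as np

# Made-up instance: k = 5 points in R^3 with positive weights c_v.
rng = np.random.default_rng(0)
k, n = 5, 3
c = rng.uniform(0.1, 2.0, size=k)   # weights c_v > 0
xs = rng.normal(size=(k, n))        # points x_v in R^n

def g(x):
    """g(x) = sum_v c_v * ||x - x_v||_2^2"""
    return sum(c[v] * np.sum((x - xs[v]) ** 2) for v in range(k))

# Candidate minimizer: the weighted average x_bar.
x_bar = (c[:, None] * xs).sum(axis=0) / c.sum()

# Gradient grad g(x) = sum_v 2 c_v (x - x_v) vanishes at x_bar ...
grad_at_xbar = 2 * c.sum() * x_bar - 2 * (c[:, None] * xs).sum(axis=0)
assert np.allclose(grad_at_xbar, 0)

# ... and g(x_bar) is never beaten by random trial points.
assert all(g(x_bar) <= g(rng.normal(size=n)) for _ in range(1000))
```

Of course this only checks finitely many points; the actual argument still needs the Hessian (or convexity).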
In order for $\bar x$ to be a minimum, the Hessian matrix at $\bar x$ has to be positive definite. What does the Hessian of $g$ look like? And then how can I argue that $\bar x$ isn't only a local minimum, but really the global minimum? (Or are there other ways to argue that $\bar x$ is a local and global minimum?)