What is the easiest way to optimize the weighted sum of L2 norms?


I have the following cost function (solving for $M$ - the $x_i$s are known):

minimize $\sum_i\sum_j(w_{ij} \cdot (x_i-x_j)^T\cdot M\cdot(x_i-x_j))$

($w_{ij} \in [-1,1] $)

subject to: $M \succeq 0$ ($M$ is positive semi definite)

Here is where I'm having trouble:

The $(x_i-x_j)^T\cdot M\cdot(x_i-x_j)$ part is convex in $x_i - x_j$ (it is essentially a squared $L_2$ norm in a space transformed by $M$), and a weighted sum of convex functions is also convex as long as the weights are nonnegative. But since the $w_{ij}$s can be negative, I think the overall cost function is non-convex.
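To probe this numerically, here is a quick sketch (with made-up random data standing in for the known $x_i$ and the weights $w_{ij}$) that evaluates the cost as a function of $M$ and checks how it behaves under linear combinations of $M$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 3
X = rng.normal(size=(n, d))           # made-up stand-ins for the known x_i
W = rng.uniform(-1, 1, size=(n, n))   # weights in [-1, 1], some negative

def f(M):
    """The cost  sum_ij w_ij (x_i - x_j)^T M (x_i - x_j)."""
    return sum(W[i, j] * (X[i] - X[j]) @ M @ (X[i] - X[j])
               for i in range(n) for j in range(n))

# As a function of M (the x_i being constants), f satisfies
# f(a*M1 + b*M2) == a*f(M1) + b*f(M2), i.e. it is linear in M.
M1 = rng.normal(size=(d, d))
M2 = rng.normal(size=(d, d))
a, b = 2.0, -3.0
print(np.isclose(f(a * M1 + b * M2), a * f(M1) + b * f(M2)))  # True
```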

I was wondering if there is a better way to formulate this to make it convex, or if there is a way to solve this problem as is?

I am fairly new to convex optimization. Any help would be appreciated!

Thanks!

Best answer:

Assuming that the numbers $w_{ij}$ are not variables, each term in the objective function is a linear function of $M$: indeed, $(x_i-x_j)^T\cdot M\cdot(x_i-x_j) = \operatorname{tr}\!\big(M\,(x_i-x_j)(x_i-x_j)^T\big)$, and the trace of $M$ times a constant matrix is linear in $M$. That's true even if $w_{ij} < 0$ for some $i,j$. So the objective function is linear, hence convex, in $M$.