Minimize the distance between two vectors depending on parameters


I have two vectors:

u = [2, 3, 5, 12, 16, 17, 29]
v = [v1, v2, v3, v1+v2, v1+v3, v2+v3, v1+v2+v3]

for example. Now I am interested in finding the values [v1, v2, v3] by some optimization method; the idea is that if v is as close to u as possible, the resulting parameters will be good enough for my purposes.

This looks a bit like a quadratic program, however I have no idea how to incorporate my constraints:

  1. the elements of v are not independent: they are the combinations of $v_1, v_2, v_3$ shown above;
  2. the last element of v must equal the last element of u, i.e. $\sum_{i} v_i = u_{\text{last}}$.

Can this be formulated as an optimization problem?

I was thinking of starting with $v_1 = v_2 = v_3 = \frac{u_{\text{last}}}{3}$ and incrementally adjusting these parameters so as to reduce the distance between the two vectors.
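For what it's worth, this is a linear least-squares problem with a single equality constraint, and the constraint can be handled by elimination rather than iteration. A minimal sketch with NumPy (the matrix `A` below is an assumption encoding the 0/1 combination pattern shown above):

```python
import numpy as np

# Design matrix: each row of v is a 0/1 combination of (v1, v2, v3)
A = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1],
              [1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]], dtype=float)
u = np.array([2, 3, 5, 12, 16, 17, 29], dtype=float)

# Enforce v1 + v2 + v3 = u[-1] by eliminating v3:
# write x = c + B @ y, where y = (v1, v2) are the free variables
c = np.array([0.0, 0.0, u[-1]])
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, -1.0]])

# Unconstrained least squares in the reduced variables
y, *_ = np.linalg.lstsq(A @ B, u - A @ c, rcond=None)
x = c + B @ y
print(x)  # approximately [8, 9, 12]
```

The elimination trick turns the constrained problem into a plain `lstsq` call; the recovered `x` automatically satisfies the sum constraint.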

Best answer:

Since we have $v_1+v_2+v_3=29$ we can eliminate $v_3$ via $$v_3=29-v_1-v_2$$

We then seek the minimum of the function $$G(v_1,v_2)=(v_1-2)^2+(v_2-3)^2+(v_1+v_2-24)^2+(v_1+v_2-12)^2+(v_2-13)^2+(v_1-12)^2$$

This can be found by setting $\nabla G = (0,0)$; dividing each component by $2$, the resulting pair of linear equations is $$2v_1+v_2=25,\qquad v_1+2v_2=26,$$ which gives $$\boxed {(v_1,v_2)=(8,9)}$$ (and hence $v_3 = 29-8-9 = 12$),

and simple numerical checks confirm that small perturbations of this point increase the value of $G$.
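That numerical check can be sketched in a few lines of plain Python, reimplementing the $G$ defined above and probing a small grid around the claimed minimum:

```python
# G(v1, v2) as defined in the answer, after eliminating v3 = 29 - v1 - v2
def G(v1, v2):
    return ((v1 - 2)**2 + (v2 - 3)**2 + (v1 + v2 - 24)**2
            + (v1 + v2 - 12)**2 + (v2 - 13)**2 + (v1 - 12)**2)

G0 = G(8, 9)
print(G0)  # 178

# probe small perturbations around (8, 9): G never decreases
for d1 in (-0.1, 0.0, 0.1):
    for d2 in (-0.1, 0.0, 0.1):
        assert G(8 + d1, 9 + d2) >= G0
```

Since $G$ is a convex quadratic, this local check is in fact conclusive: $(8,9)$ is the global minimizer.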