Optimization with vectors

I don't know if a simple solution exists, but here is my problem: I have a vector with positive entries $m \in (\mathbb{R}^{+})^n$ and a set of vectors $(v^1, v^2, \cdots, v^M)$, each in $\mathbb{R}^n$, whose components may be negative. All of these vectors are normalized to 1.

I want the linear combination of the $v^i$, $i \in \{1,\dots,M\}$, whose vector of absolute values is closest to $m$; i.e., I want to minimize the quantity

$$ L(\beta) = \sum_{k=1}^{n} \left| m_k - \left| \sum_{i=1}^M \beta^i v^i_k \right| \right| $$

and recover the $\beta^i$. This feels like an optimization problem, but I have no clue where to start solving it (numerically, in the general case).
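Since the objective is nonsmooth (absolute values inside absolute values), one straightforward numerical approach is a derivative-free minimizer. A minimal sketch, assuming SciPy is available; the dimensions and random data here are made up for illustration:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, M = 5, 3                       # illustrative dimensions
V = rng.normal(size=(n, M))
V /= np.linalg.norm(V, axis=0)    # normalize each column v^i to 1
m = np.abs(rng.normal(size=n))    # positive target vector

def L(beta):
    # L(beta) = sum_k | m_k - | (V beta)_k | |
    return np.abs(m - np.abs(V @ beta)).sum()

# the objective is nonsmooth, so a derivative-free method is safer
res = minimize(L, x0=np.zeros(M), method="Nelder-Mead")
print(res.x, res.fun)
```

Nelder-Mead only finds a local minimum; since $|{\cdot}|$ makes the landscape nonconvex, restarting from several initial points is advisable.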

1 Answer

Calling

$$ v^i = (v^i_1,\cdots,v^i_n)^{\top},\qquad \beta = (\beta_1,\cdots,\beta_M)^{\top},\qquad e_k = (0,\cdots,1_k,\cdots,0)^{\top},\qquad V = [v^1,v^2,\cdots,v^M] $$

and replacing the absolute values by squares to obtain a smooth surrogate

$$ f(\beta) = \sum_{k=1}^n\left(m_k-\left(e_k^{\top}V\beta\right)^2\right)^2 $$

we have the extrema conditions

$$ \frac{\partial f}{\partial \beta_j} = -4\sum_{k=1}^n\left(m_k-\left(e_k^{\top}V\beta\right)^2\right)\left(e_k^{\top}V\beta\right)\left(e_k^{\top}v^j\right) = 0,\qquad j = 1,\dots,M. $$

This is a nonlinear system of $M$ equations in the $M$ unknowns $\beta_1,\dots,\beta_M$.
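Rather than solving the stationarity system directly, the smoothed objective can be handed to a least-squares solver, since $f$ is a sum of squared residuals $r_k(\beta) = m_k - (e_k^{\top}V\beta)^2$. A sketch, assuming SciPy and made-up random data:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
n, M = 6, 4                       # illustrative dimensions
V = rng.normal(size=(n, M))
V /= np.linalg.norm(V, axis=0)    # normalize each column v^i to 1
m = np.abs(rng.normal(size=n))    # positive target vector

# residuals of the smoothed objective: r_k = m_k - ((V beta)_k)^2
def residuals(beta):
    return m - (V @ beta) ** 2

x0 = rng.normal(size=M)           # random starting point
sol = least_squares(residuals, x0)
print(sol.x, sol.cost)            # sol.cost = (1/2) * f(sol.x)
```

Note that the surrogate matches $m_k$ against the *square* of the $k$-th component, not its absolute value, so its minimizer need not coincide with that of the original $L$; like the original problem, it is nonconvex, so several random restarts are recommended.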