I have a set of $n$ linear combinations, each with $m$ parameters and a desired value $b_i$. I want to find the set of weights $w$ that minimizes the total misfit (e.g. the sum of distances between each of the $n$ linear combinations and its corresponding $b_i$).
$ \begin{Bmatrix} x_{1,1}\,w_1 + x_{1,2}\,w_2 + \dots + x_{1,m}\,w_m \cong b_1\\ x_{2,1}\,w_1 + x_{2,2}\,w_2 + \dots + x_{2,m}\,w_m \cong b_2\\ \vdots\\ x_{n,1}\,w_1 + x_{n,2}\,w_2 + \dots + x_{n,m}\,w_m \cong b_n \end{Bmatrix}$
However, I want to constrain $w$ so that $0 \leqslant w_i \leqslant 1$ holds for every $i$.
What method should I be looking for to implement this?
If I understand the problem properly, you know the $x_{i,j}$'s and the $b_i$'s, and you are searching for the $w_i$'s. This is a typical multiple linear regression.
Now, if you want to impose $0 \leq w_i \leq 1$, you face an optimization problem with bound constraints.
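Writing $X$ for the $n \times m$ matrix of coefficients, the problem can be stated compactly as a bound-constrained least-squares problem:

```latex
\min_{w \in \mathbb{R}^m} \ \lVert Xw - b \rVert_2^2
\quad \text{subject to} \quad 0 \le w_i \le 1, \quad i = 1, \dots, m.
```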
Searching, I found some information about Matlab's capabilities at http://fr.mathworks.com/help/optim/ug/lsqlin.html; the `lsqlin` function solves exactly this kind of constrained linear least-squares problem.
If you are not willing to use Matlab, you will find plenty of available source code for this task, since bound-constrained optimization is the simplest constrained optimization problem (after unconstrained optimization). Have a look at http://en.wikipedia.org/wiki/Limited-memory_BFGS (its L-BFGS-B variant handles bound constraints) and at http://jblevins.org/mirror/amiller/.
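If Python is an option, here is a minimal sketch of the same approach using SciPy's `lsq_linear`, which solves linear least squares with box constraints directly. The matrix `X` and vector `b` below are made-up example data, not from the question:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical data: n = 4 equations, m = 2 weights.
X = np.array([[1.0, 2.0],
              [3.0, 1.0],
              [0.5, 0.5],
              [2.0, 2.0]])
b = np.array([1.0, 2.0, 0.5, 2.5])

# Minimize ||X w - b||_2 subject to 0 <= w_i <= 1.
res = lsq_linear(X, b, bounds=(0.0, 1.0))
w = res.x

print(w)         # fitted weights, each within [0, 1]
print(res.cost)  # 0.5 * ||X w - b||^2 at the solution
```

Under the hood `lsq_linear` uses a trust-region reflective method by default, which is well suited to this simple box-constrained case.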