I have posted this on Cross Validated, but so far no one has been able to answer.
I am trying to decompose a noisy vector into a sum of signals. Say we have $s$ independent signals and $c$ channels. Each signal can lead to an increase of output in some, but not all, of the channels, with a specific strength relative to the overall strength of that signal. For example, signal $S1$ goes into channel $C1$ with a strength of 0.5 (i.e. 50% of the overall strength of $S1$), into channel $C20$ with a strength of 0.1, and so on. Other signals can also go into the same channels or into different ones.
The overall signal is measured in all available channels simultaneously, giving me a readout vector $Y = \{C1, C2, ..., Cc\}$. If I did not have the signal mixing matrix, I could use independent component analysis for this. But I do have the mixing matrix $M$, with $c$ rows and $s$ columns, where $M[a,b]$ gives the relative strength of signal $Sb$ in channel $Ca$.
What I am trying to calculate is a vector $L$ (similar to a loading vector in ICA) that tells me the strength of each signal. The important part is that each signal can either be silent or have some positive strength; it cannot be negative. So basically $Y = M L + \epsilon$, where $\epsilon$ is the error and each element of $L$ and $M$ is either positive or (mostly) zero. I tried basic matrix rules, $L = M^{-1} Y$ with a generalized (pseudo)inverse of $M$, but that results in some negative values in $L$. I have also tried linear regression with $M$ as the design matrix, but that also gives me negative values. Is there a way to find the best possible $L$ with only non-negative values?
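To make the problem concrete, here is a small toy version of what I tried, with made-up dimensions and a randomly generated sparse non-negative $M$ (the real problem is much larger): the unconstrained pseudoinverse solve has nothing forcing the entries of $L$ to stay non-negative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (the real problem has >1000 channels and signals).
c, s = 30, 10

# Sparse non-negative mixing matrix M (c x s): each signal feeds a few channels.
M = np.where(rng.random((c, s)) < 0.2, rng.random((c, s)), 0.0)

# Ground-truth non-negative loading vector L_true (some signals silent),
# and a noisy readout Y = M @ L_true + noise.
L_true = np.where(rng.random(s) < 0.5, rng.random(s), 0.0)
Y = M @ L_true + 0.05 * rng.standard_normal(c)

# Unconstrained solve via the Moore-Penrose pseudoinverse: L = pinv(M) @ Y.
# Nothing here enforces L >= 0, so estimates of silent signals can dip negative.
L_pinv = np.linalg.pinv(M) @ Y
print("any negative entries:", (L_pinv < 0).any())
```

Ordinary linear regression of $Y$ on the columns of $M$ gives the same least-squares solution here, so it suffers from the same problem.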
Also, in my case $M$ is very sparse: there are >1000 channels and signals, each signal only goes into 1 to 50 channels, and the readout $Y$ is very noisy.