Suppose that we have a series of observations $x_1, x_2, \dots, x_n$.
Each $x_i$ is generated by $aX + bY + cZ$ where $a$, $b$, and $c$ are real constants and $X$, $Y$, and $Z$ are independent random variables.
The distribution of $X$, $Y$, and $Z$ is Poisson, with $\lambda_X = w_x/a$, $\lambda_Y = w_y/b$, and $\lambda_Z = w_z/c$, with the $w$'s being real constants. We know the sum of the $w$'s, i.e. $w_x + w_y + w_z = k$ and we know $k$. (I know, it's weird).
Is it (even) possible to estimate $a$, $b$, $c$, and $w_x$, $w_y$, and $w_z$? How would you do it?
Thank you!
The statistical problem here is to recover the experiment's parameter combination from the sample data; this requires a large sample size.
We will use a Bayesian approach.
THE PROBABILITY DISTRIBUTIONS
First, one has to assign a priori probabilities to each possible parameter combination. Every parameter combination is given by the vector of the distribution-law parameters $$\vec \lambda = \left(\frac {w_x}a, \frac {w_y}b, \frac {w_z}c\right)$$ with known $w_x, w_y, w_z$, and the vector of the coefficients $$\vec v = (a, b, c).$$
The vector $\vec\lambda$ defines the probability of every particular count triple $(x_n, y_n, z_n)$ as the product of probabilities $$P(x_n, y_n, z_n,\vec\lambda) = P_1\left(x_n, \frac {w_x}a\right)P_1\left(y_n, \frac {w_y}b\right)P_1\left(z_n, \frac {w_z}c\right),$$ $$P_1(j, \lambda_k) = \frac{1}{j!}\lambda_k^je^{-\lambda_k},\tag1$$ where $P_1(j, \lambda_k)$ is the probability of obtaining exactly $j$ successes when the expected number of successes is $\lambda_k,$ according to the Poisson distribution.
This means that the values $$x_n = 0, 1,\dots,\quad y_n = 0, 1,\dots,\quad z_n = 0, 1,\dots$$ are nonnegative integers.
Then $$P(x_n, y_n, z_n,\vec\lambda) = \frac{1}{x_n!y_n!z_n!} \left(\frac {w_x}a\right)^{x_n} \left(\frac{w_y}b\right)^{y_n} \left(\frac {w_z}c\right)^{z_n} e^{-\left(\frac{w_x}a + \frac{w_y}b + \frac{w_z}c \right)},$$ or $$P(x_n, y_n, z_n,\vec\lambda) = \frac{(x_n, y_n, z_n)!}{(x_n+y_n+z_n)!} \left(\frac {w_x}a\right)^{x_n} \left(\frac{w_y}b\right)^{y_n} \left(\frac {w_z}c\right)^{z_n} e^{-\left(\frac{w_x}a + \frac{w_y}b + \frac{w_z}c \right)},\tag2$$ where $$(x_n, y_n, z_n)! = \frac{(x_n + y_n +z_n)!}{x_n!y_n!z_n!}$$ are the multinomial coefficients.
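As a quick numerical check of $(2)$, the joint pmf can be coded directly; this is a minimal sketch, and the rates used below are illustrative, not taken from the problem:

```python
from math import exp, factorial

def joint_pmf(x, y, z, lam_x, lam_y, lam_z):
    """Joint probability of counts (x, y, z) from three independent
    Poisson variables, following equation (2)."""
    return (lam_x**x / factorial(x)
            * lam_y**y / factorial(y)
            * lam_z**z / factorial(z)
            * exp(-(lam_x + lam_y + lam_z)))

# Illustrative rates lam = (w_x/a, w_y/b, w_z/c) = (1.0, 2.0, 0.5):
p = joint_pmf(1, 2, 0, 1.0, 2.0, 0.5)  # = 2 * e^{-3.5}, roughly 0.06
```

Summing `joint_pmf` over a sufficiently large grid of triples recovers total probability one, which is a useful sanity check on the implementation.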
Note that $$\sum\limits_{x_n + y_n + z_n = s_n}P(x_n, y_n, z_n,\vec\lambda) = \frac{1}{s_n!} \left(\frac {w_x}a + \frac{w_y}b + \frac {w_z}c\right)^{s_n} e^{-\left(\frac{w_x}a + \frac{w_y}b + \frac{w_z}c \right)},$$ $$\sum\limits_{x_n + y_n + z_n = s_n}P(x_n, y_n, z_n,\vec\lambda) = \frac{1}{s_n!} \lambda_s^{s_n} e^{-\lambda_s} = P_1(s_n, \lambda_s),\tag3$$ where $$\lambda_s = \frac {w_x}a + \frac{w_y}b + \frac {w_z}c.$$
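Relation $(3)$ (the sum of independent Poisson counts is again Poisson, with rate $\lambda_s$) can be verified numerically; a minimal sketch with illustrative rates:

```python
from math import exp, factorial

def poisson_pmf(j, lam):
    """Single Poisson pmf P_1(j, lam) from equation (1)."""
    return lam**j / factorial(j) * exp(-lam)

def sum_over_shell(s, lx, ly, lz):
    """Left-hand side of (3): total probability of all count
    triples (x, y, z) with x + y + z = s."""
    total = 0.0
    for x in range(s + 1):
        for y in range(s + 1 - x):
            z = s - x - y
            total += (lx**x / factorial(x)) * (ly**y / factorial(y)) \
                     * (lz**z / factorial(z)) * exp(-(lx + ly + lz))
    return total

# With rates (1.0, 2.0, 0.5), the shell sum matches Poisson(s; 3.5):
lhs = sum_over_shell(4, 1.0, 2.0, 0.5)
rhs = poisson_pmf(4, 3.5)
```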
ADDITIONAL CONDITIONS
The problem as stated is not completely defined.
It is easy to see that the combination $(a, w_x)$ and any rescaled combination $t\,(a, w_x)$, $t > 0$, have the same influence on the count probabilities (since $\lambda_X = w_x/a$ is unchanged) while yielding different results, which makes the estimates ill-defined. The same situation takes place for the pairs $(b, w_y)$ and $(c, w_z).$
Taking into account that all of these parameters are real, this makes the task of simultaneously determining the vectors $\vec v = (a, b, c)$ and $\vec w = (w_x, w_y, w_z)$ ill-posed.
Similar problems arise if $a,\ b,\ c$ can have different signs.
Let us therefore consider the task in which the vector $\vec w$ is given and $a > 0,\ b > 0,\ c > 0$.
THE HYPOTHESES CONCEPT
The Bayesian approach allows one to calculate conditional (a posteriori) probabilities from unconditional (a priori) probabilities and the data sample, using the concept of hypotheses.
Let the space of all possible outcomes be divided into hypotheses $H_i,$ where $H_i$ is the event $\vec v \approx \vec v_i,$ i.e. $(a \approx a_i)\text{ and } (b \approx b_i)\text{ and } (c\approx c_i).$ The unconditional probabilities of $H_i$ are $$P(H_i) = \dfrac{m_i}{m},$$ where $m$ is the total quantity of outcomes and $m_i$ is the quantity of outcomes favorable to the event $H_i.$
Taking into account that the sample size is bounded, it seems reasonable to discretize the task parameters by considering $$a_i = \frac{W}{I}i,\quad b_i = \frac{W}{I}i,\quad c_i = \frac{W}{I}i$$ (with an independent index for each parameter, so that the hypotheses enumerate all grid combinations), where $$i = 0, 1, 2\dots I,\quad W = C \max\{w_x, w_y, w_z\},\quad C \in[5, 10],$$ and to bound the values $$x_n = 0, 1, 2\dots N,\quad y_n = 0, 1, 2\dots N,\quad z_n = 0, 1, 2\dots N,$$ which requires discretizing the sample data ($x_n$ in the OP's notation) into the gradations $$k = 0,\ 1\dots K,\quad K = I\cdot N.$$
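As a concrete illustration of this discretization step, here is a minimal sketch; the values of $w_x, w_y, w_z$, $I$, $N$, and $C$ are illustrative choices, not prescribed by the problem:

```python
# Illustrative constants (assumptions, not from the problem statement):
w_x, w_y, w_z = 2.0, 3.0, 1.0
I, N, C = 10, 8, 5
W = C * max(w_x, w_y, w_z)

# Parameter grid; index 0 is dropped to keep a, b, c strictly positive:
grid = [W / I * i for i in range(1, I + 1)]
a_grid = b_grid = c_grid = grid

# Hypotheses H_i enumerate all (a, b, c) combinations on the grid:
hypotheses = [(a, b, c) for a in a_grid for b in b_grid for c in c_grid]

# Each observation is later rounded onto K = I * N gradations:
K = I * N
```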
This preparation gives the discretized sample data $\vec d = \{d_k = a_i x_n + b_i y_n + c_i z_n\},$ which allows us to use relations $(2)$ and $(3).$
Thus, $$P(H_i) = 1/I,\tag4$$ $$P(d_k | H_i) = \sum\limits_{a_i x_n + b_i y_n + c_i z_n = d_k}P(x_n, y_n, z_n, \vec\lambda_i),$$ $$P(d_k | H_i) = e^{-\left(\frac{w_x}{a_i} + \frac{w_y}{b_i} + \frac{w_z}{c_i} \right)}\sum\limits_{a_i x_n + b_i y_n + c_i z_n = d_k} \frac{(x_n, y_n, z_n)!}{(x_n+y_n+z_n)!} \left(\frac {w_x}{a_i}\right)^{x_n} \left(\frac{w_y}{b_i}\right)^{y_n} \left(\frac {w_z}{c_i}\right)^{z_n},\tag5$$ and, using $(3),$ $$\sum\limits_{k = 0}^{K}P(d_k | H_i) = 1.\tag6$$
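Equation $(5)$ can be evaluated by brute-force enumeration of count triples. A minimal sketch, where the truncation bound `N` and the matching tolerance `tol` are assumptions of the sketch:

```python
from math import exp, factorial

def cond_prob(d, a, b, c, w_x, w_y, w_z, N=15, tol=1e-9):
    """P(d | H_i) per equation (5): sum the joint Poisson pmf over
    all count triples (x, y, z) with a*x + b*y + c*z == d
    (equality tested within tol; counts truncated at N)."""
    lx, ly, lz = w_x / a, w_y / b, w_z / c
    total = 0.0
    for x in range(N + 1):
        for y in range(N + 1):
            for z in range(N + 1):
                if abs(a * x + b * y + c * z - d) < tol:
                    total += (lx**x / factorial(x)) * (ly**y / factorial(y)) \
                             * (lz**z / factorial(z)) * exp(-(lx + ly + lz))
    return total
```

For $a = b = c = 1$ this reduces to relation $(3)$: the result is the Poisson pmf of the sum with rate $\lambda_s$.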
Since the sample data are independent, it is easy to obtain $$P(\vec d | H_i) = \prod_{k = 0}^{K} P(d_k | H_i).\tag7$$
PARAMETERS DISTRIBUTION
The parameter distribution can be obtained via Bayes' theorem:
$$P(H_i|\vec d) = \dfrac{P(H_i)P(\vec d | H_i)}{\sum\limits_{i=0}^I P(H_i)P(\vec d | H_i)}.$$ Taking into account $(4),$ $$P_i = P(\vec v_i|\vec d) = \dfrac{P(\vec d | H_i)}{\sum\limits_{i=0}^I P(\vec d | H_i)}.\tag8$$
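Combining $(7)$ and $(8)$: with the uniform prior $(4)$, the posterior over hypotheses is just the normalized product of per-observation likelihoods. The sketch below works in log space to avoid underflow for large samples (a practical implementation choice, not part of the derivation above):

```python
from math import log, exp

def posterior_from_sample(per_obs_probs):
    """Equations (7)-(8): per_obs_probs[i][n] = P(d_n | H_i).
    With a uniform prior, the posterior is the normalized product
    of per-observation likelihoods, computed in log space."""
    log_L = [sum(log(p) for p in probs) for probs in per_obs_probs]
    m = max(log_L)                       # subtract the max for stability
    weights = [exp(l - m) for l in log_L]
    total = sum(weights)
    return [w / total for w in weights]

# Toy example: two hypotheses, three observations.
post = posterior_from_sample([[0.2, 0.1, 0.3],
                              [0.4, 0.2, 0.3]])  # -> [0.2, 0.8]
```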
The maximum of $P_i$ determines the vector $\vec{v_i}$ according to the maximum likelihood method (here equivalent to maximum a posteriori, since the prior $(4)$ is uniform).
At the same time, formula $(8)$ defines the distribution law of the parameters. If substitution of the vector $\vec{v_i}$ into $(2)$ is planned, then it is useful to know about the existence of the a posteriori probability distribution law, defined by the formula $$P(x_n, y_n, z_n |\vec d) = \sum\limits_i P(x_n, y_n, z_n, \vec\lambda(\vec v_i))P(\vec v_i|\vec d)\tag9$$ in accordance with Fisher's fiducial approach, which accounts for the influence of the statistics on the form of the distribution law.
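Formula $(9)$ is a posterior-weighted mixture of the per-hypothesis laws $(2)$; a minimal sketch:

```python
from math import exp, factorial

def joint_pmf(x, y, z, lx, ly, lz):
    """Equation (2): joint pmf of three independent Poisson counts."""
    return (lx**x / factorial(x)) * (ly**y / factorial(y)) \
           * (lz**z / factorial(z)) * exp(-(lx + ly + lz))

def posterior_predictive(x, y, z, lambdas, post):
    """Equation (9): mixture of per-hypothesis joint pmfs, weighted
    by the posterior probabilities post[i] = P(v_i | d).
    lambdas[i] is the rate vector (w_x/a_i, w_y/b_i, w_z/c_i)."""
    return sum(joint_pmf(x, y, z, *lam) * p for lam, p in zip(lambdas, post))
```

With a single hypothesis of posterior weight 1, this reduces to $(2)$ itself, which is a convenient sanity check.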