Construction of a MGF for conditional random variables


This question is motivated by a comment under the answer to "understanding of difference between weighted variables".

Let $X_i, i=1, \ldots, n$ be independent Poisson random variables with parameters $\lambda_i$ respectively, conditioned such that $\sum_{i=1}^n X_i=A$. Let $a_i \in \mathbb{R}, i=1, \ldots, n$. Denote $S=\sum_{i=1}^n a_iX_i$.

Find the moment generating function $M_S(t)$ of $S$.

I have been trying to follow the steps in the answer cited above, but because of the extra condition on the variables, I have not been able to succeed.

On BEST ANSWER
  1. As has been pointed out in the comments, $(X_1,\ldots,X_n)$ conditioned on $\sum_{i=1}^n X_i = A$ is multinomial with $p_i = \lambda_i/\sum_j \lambda_j$, $i=1,\ldots,n$, and number of trials $A$.

  2. The MGF of a multinomial $(X_1,\ldots,X_k)$ with parameters $p_1,\ldots,p_k$ and number of trials $n$ is $$ E[e^{\sum_{i=1}^k \theta_iX_i}] = \left(\sum_{i=1}^k p_ie^{\theta_i}\right)^n $$ (Note that this is the multidimensional version of the MGF, which is why it involves a linear combination in the exponent.)

  3. To get the MGF of $\sum_{i=1}^n a_iX_i$ conditioned on $\sum_{i=1}^n X_i=A$, apply this result to get: $$ E[e^{\theta \sum_{i=1}^n a_iX_i}|\sum_{i=1}^n X_i=A] = E[e^{\sum_{i=1}^n (\theta a_i)X_i}|\sum_{i=1}^n X_i=A] = \left(\sum_{i=1}^n p_ie^{\theta a_i}\right)^A $$

You can see some calculations for Step 2 here: https://stats.stackexchange.com/questions/61697/moment-generating-function-of-multinomial-distribution (Note, $\sum_{i=1}^k p_i = 1$, so I think the calculations can be simplified.)
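The result in Step 3 is easy to sanity-check numerically: sample the unconditioned Poisson variables, keep only the draws whose sum equals $A$, and compare the empirical value of $E[e^{\theta S}]$ against the closed-form $\left(\sum_i p_i e^{\theta a_i}\right)^A$. Here is a minimal Monte Carlo sketch; the specific values of $\lambda_i$, $a_i$, $A$, and $\theta$ are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example parameters (not from the original post).
lam = np.array([1.0, 2.0, 0.5])   # Poisson rates lambda_i
a = np.array([1.0, -1.0, 2.0])    # coefficients a_i
A = 4                             # conditioning value of sum_i X_i
theta = 0.3                       # point at which to evaluate the MGF

# Multinomial probabilities p_i = lambda_i / sum_j lambda_j (Step 1).
p = lam / lam.sum()

# Closed-form conditional MGF from Step 3.
mgf_formula = (p @ np.exp(theta * a)) ** A

# Monte Carlo: draw unconditioned Poisson vectors, retain only those
# with sum == A, then average exp(theta * S) over the retained draws.
X = rng.poisson(lam, size=(2_000_000, len(lam)))
keep = X.sum(axis=1) == A
S = X[keep] @ a
mgf_mc = np.exp(theta * S).mean()

print(mgf_formula, mgf_mc)  # the two estimates should agree closely
```

With a couple of million draws, the rejection step keeps enough samples (the sum is Poisson with rate $\sum_i \lambda_i$, so $P(\text{sum}=A)$ is not tiny here) for the Monte Carlo estimate to land within about a percent of the formula.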